Jan 23 08:21:34 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 08:21:34 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 08:21:34 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 08:21:34 localhost kernel: BIOS-provided physical RAM map:
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 08:21:34 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000027fffffff] usable
Jan 23 08:21:34 localhost kernel: NX (Execute Disable) protection: active
Jan 23 08:21:34 localhost kernel: APIC: Static calls initialized
Jan 23 08:21:34 localhost kernel: SMBIOS 2.8 present.
Jan 23 08:21:34 localhost kernel: DMI: Red Hat OpenStack Compute/RHEL, BIOS 1.16.1-1.el9 04/01/2014
Jan 23 08:21:34 localhost kernel: Hypervisor detected: KVM
Jan 23 08:21:34 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 08:21:34 localhost kernel: kvm-clock: using sched offset of 3205138143 cycles
Jan 23 08:21:34 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 08:21:34 localhost kernel: tsc: Detected 2445.404 MHz processor
Jan 23 08:21:34 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 23 08:21:34 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 23 08:21:34 localhost kernel: last_pfn = 0x280000 max_arch_pfn = 0x400000000
Jan 23 08:21:34 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 08:21:34 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 08:21:34 localhost kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Jan 23 08:21:34 localhost kernel: found SMP MP-table at [mem 0x000f5b60-0x000f5b6f]
Jan 23 08:21:34 localhost kernel: Using GB pages for direct mapping
Jan 23 08:21:34 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 08:21:34 localhost kernel: ACPI: Early table checksum verification disabled
Jan 23 08:21:34 localhost kernel: ACPI: RSDP 0x00000000000F5B20 000014 (v00 BOCHS )
Jan 23 08:21:34 localhost kernel: ACPI: RSDT 0x000000007FFE35EB 000034 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: FACP 0x000000007FFE3403 0000F4 (v03 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: DSDT 0x000000007FFDFCC0 003743 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: FACS 0x000000007FFDFC80 000040
Jan 23 08:21:34 localhost kernel: ACPI: APIC 0x000000007FFE34F7 000090 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: MCFG 0x000000007FFE3587 00003C (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: WAET 0x000000007FFE35C3 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 08:21:34 localhost kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe3403-0x7ffe34f6]
Jan 23 08:21:34 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfcc0-0x7ffe3402]
Jan 23 08:21:34 localhost kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfc80-0x7ffdfcbf]
Jan 23 08:21:34 localhost kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe34f7-0x7ffe3586]
Jan 23 08:21:34 localhost kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe3587-0x7ffe35c2]
Jan 23 08:21:34 localhost kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe35c3-0x7ffe35ea]
Jan 23 08:21:34 localhost kernel: No NUMA configuration found
Jan 23 08:21:34 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000027fffffff]
Jan 23 08:21:34 localhost kernel: NODE_DATA(0) allocated [mem 0x27ffd5000-0x27fffffff]
Jan 23 08:21:34 localhost kernel: crashkernel reserved: 0x000000006f000000 - 0x000000007f000000 (256 MB)
Jan 23 08:21:34 localhost kernel: Zone ranges:
Jan 23 08:21:34 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 08:21:34 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 08:21:34 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000027fffffff]
Jan 23 08:21:34 localhost kernel:   Device   empty
Jan 23 08:21:34 localhost kernel: Movable zone start for each node
Jan 23 08:21:34 localhost kernel: Early memory node ranges
Jan 23 08:21:34 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 08:21:34 localhost kernel:   node   0: [mem 0x0000000000100000-0x000000007ffdafff]
Jan 23 08:21:34 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000027fffffff]
Jan 23 08:21:34 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000027fffffff]
Jan 23 08:21:34 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 08:21:34 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 08:21:34 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 08:21:34 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 08:21:34 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 08:21:34 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 08:21:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 08:21:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 08:21:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 08:21:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 08:21:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 08:21:34 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 08:21:34 localhost kernel: TSC deadline timer available
Jan 23 08:21:34 localhost kernel: CPU topo: Max. logical packages:   4
Jan 23 08:21:34 localhost kernel: CPU topo: Max. logical dies:       4
Jan 23 08:21:34 localhost kernel: CPU topo: Max. dies per package:   1
Jan 23 08:21:34 localhost kernel: CPU topo: Max. threads per core:   1
Jan 23 08:21:34 localhost kernel: CPU topo: Num. cores per package:     1
Jan 23 08:21:34 localhost kernel: CPU topo: Num. threads per package:   1
Jan 23 08:21:34 localhost kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Jan 23 08:21:34 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 08:21:34 localhost kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 23 08:21:34 localhost kernel: kvm-guest: setup PV sched yield
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x7ffdb000-0x7fffffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x80000000-0xafffffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xb0000000-0xbfffffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfed1bfff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed1c000-0xfed1ffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed20000-0xfeffbfff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 08:21:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 08:21:34 localhost kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 23 08:21:34 localhost kernel: Booting paravirtualized kernel on KVM
Jan 23 08:21:34 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 08:21:34 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Jan 23 08:21:34 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u524288
Jan 23 08:21:34 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u524288 alloc=1*2097152
Jan 23 08:21:34 localhost kernel: pcpu-alloc: [0] 0 1 2 3 
Jan 23 08:21:34 localhost kernel: kvm-guest: PV spinlocks enabled
Jan 23 08:21:34 localhost kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 23 08:21:34 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 08:21:34 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 08:21:34 localhost kernel: random: crng init done
Jan 23 08:21:34 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 08:21:34 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 08:21:34 localhost kernel: Fallback order for Node 0: 0 
Jan 23 08:21:34 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 08:21:34 localhost kernel: Policy zone: Normal
Jan 23 08:21:34 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 08:21:34 localhost kernel: software IO TLB: area num 4.
Jan 23 08:21:34 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 23 08:21:34 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 08:21:34 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 08:21:34 localhost kernel: Dynamic Preempt: voluntary
Jan 23 08:21:34 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 08:21:34 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 23 08:21:34 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=4.
Jan 23 08:21:34 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 23 08:21:34 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 23 08:21:34 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 23 08:21:34 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 08:21:34 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 23 08:21:34 localhost kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 08:21:34 localhost kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 08:21:34 localhost kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 08:21:34 localhost kernel: NR_IRQS: 524544, nr_irqs: 456, preallocated irqs: 16
Jan 23 08:21:34 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 08:21:34 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 08:21:34 localhost kernel: Console: colour VGA+ 80x25
Jan 23 08:21:34 localhost kernel: printk: console [ttyS0] enabled
Jan 23 08:21:34 localhost kernel: ACPI: Core revision 20230331
Jan 23 08:21:34 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 08:21:34 localhost kernel: x2apic enabled
Jan 23 08:21:34 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 08:21:34 localhost kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 23 08:21:34 localhost kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 23 08:21:34 localhost kernel: kvm-guest: setup PV IPIs
Jan 23 08:21:34 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 08:21:34 localhost kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Jan 23 08:21:34 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 08:21:34 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 08:21:34 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 08:21:34 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 08:21:34 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 08:21:34 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 08:21:34 localhost kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 23 08:21:34 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 08:21:34 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 08:21:34 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 08:21:34 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 08:21:34 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 08:21:34 localhost kernel: Transient Scheduler Attacks: Vulnerable: No microcode
Jan 23 08:21:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 08:21:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 08:21:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 08:21:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 23 08:21:34 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 08:21:34 localhost kernel: x86/fpu: xstate_offset[9]:  832, xstate_sizes[9]:    8
Jan 23 08:21:34 localhost kernel: x86/fpu: Enabled xstate features 0x207, context size is 840 bytes, using 'compacted' format.
Jan 23 08:21:34 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 23 08:21:34 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 23 08:21:34 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 08:21:34 localhost kernel: landlock: Up and running.
Jan 23 08:21:34 localhost kernel: Yama: becoming mindful.
Jan 23 08:21:34 localhost kernel: SELinux:  Initializing.
Jan 23 08:21:34 localhost kernel: LSM support for eBPF active
Jan 23 08:21:34 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 08:21:34 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 08:21:34 localhost kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Jan 23 08:21:34 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 08:21:34 localhost kernel: ... version:                0
Jan 23 08:21:34 localhost kernel: ... bit width:              48
Jan 23 08:21:34 localhost kernel: ... generic registers:      6
Jan 23 08:21:34 localhost kernel: ... value mask:             0000ffffffffffff
Jan 23 08:21:34 localhost kernel: ... max period:             00007fffffffffff
Jan 23 08:21:34 localhost kernel: ... fixed-purpose events:   0
Jan 23 08:21:34 localhost kernel: ... event mask:             000000000000003f
Jan 23 08:21:34 localhost kernel: signal: max sigframe size: 3376
Jan 23 08:21:34 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 23 08:21:34 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 23 08:21:34 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 23 08:21:34 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 23 08:21:34 localhost kernel: .... node  #0, CPUs:      #1 #2 #3
Jan 23 08:21:34 localhost kernel: smp: Brought up 1 node, 4 CPUs
Jan 23 08:21:34 localhost kernel: smpboot: Total of 4 processors activated (19563.23 BogoMIPS)
Jan 23 08:21:34 localhost kernel: node 0 deferred pages initialised in 7ms
Jan 23 08:21:34 localhost kernel: Memory: 7766072K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 617308K reserved, 0K cma-reserved)
Jan 23 08:21:34 localhost kernel: devtmpfs: initialized
Jan 23 08:21:34 localhost kernel: x86/mm: Memory block size: 128MB
Jan 23 08:21:34 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 08:21:34 localhost kernel: futex hash table entries: 1024 (65536 bytes on 1 NUMA nodes, total 64 KiB, linear).
Jan 23 08:21:34 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 08:21:34 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 08:21:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 08:21:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 08:21:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 08:21:34 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 23 08:21:34 localhost kernel: audit: type=2000 audit(1769156494.321:1): state=initialized audit_enabled=0 res=1
Jan 23 08:21:34 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 08:21:34 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 08:21:34 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 08:21:34 localhost kernel: cpuidle: using governor menu
Jan 23 08:21:34 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 08:21:34 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jan 23 08:21:34 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 23 08:21:34 localhost kernel: PCI: Using configuration type 1 for base access
Jan 23 08:21:34 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 08:21:34 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 08:21:34 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 08:21:34 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 08:21:34 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 08:21:34 localhost kernel: Demotion targets for Node 0: null
Jan 23 08:21:34 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 08:21:34 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 23 08:21:34 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 23 08:21:34 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 08:21:34 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 08:21:34 localhost kernel: ACPI: Interpreter enabled
Jan 23 08:21:34 localhost kernel: ACPI: PM: (supports S0 S5)
Jan 23 08:21:34 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 08:21:34 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 08:21:34 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 08:21:34 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 23 08:21:34 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 08:21:34 localhost kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 08:21:34 localhost kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR DPC]
Jan 23 08:21:34 localhost kernel: acpi PNP0A08:00: _OSC: OS now controls [SHPCHotplug PME AER PCIeCapability]
Jan 23 08:21:34 localhost kernel: PCI host bridge to bus 0000:00
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x280000000-0xa7fffffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: BAR 0 [mem 0xf9800000-0xf9ffffff pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: BAR 2 [mem 0xfc200000-0xfc203fff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: BAR 0 [mem 0xfea1a000-0xfea1afff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: BAR 0 [mem 0xfea1b000-0xfea1bfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: BAR 0 [mem 0xfea1c000-0xfea1cfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: BAR 0 [mem 0xfea1d000-0xfea1dfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: BAR 0 [mem 0xfea1e000-0xfea1efff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: BAR 0 [mem 0xfea1f000-0xfea1ffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: BAR 0 [mem 0xfea20000-0xfea20fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: BAR 0 [mem 0xfea21000-0xfea21fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.0: quirk: [io  0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.2: BAR 4 [io  0xd040-0xd05f]
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea22000-0xfea22fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:00:1f.3: BAR 4 [io  0x0700-0x073f]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0: BAR 0 [mem 0xfc800000-0xfc8000ff 64bit]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:02: extended config space not accessible
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [1] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [2] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [3] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [4] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [5] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [6] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [7] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [8] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [9] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [10] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [11] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [12] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [13] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [14] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [15] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [16] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [17] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [18] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [19] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [20] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [21] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [22] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [23] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [24] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [25] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [26] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [27] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [28] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [29] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [30] registered
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [31] registered
Jan 23 08:21:34 localhost kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:02:01.0: BAR 4 [io  0xc000-0xc01f]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-2] registered
Jan 23 08:21:34 localhost kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe840000-0xfe840fff]
Jan 23 08:21:34 localhost kernel: pci 0000:03:00.0: BAR 4 [mem 0xfbe00000-0xfbe03fff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:03:00.0: ROM [mem 0xfe800000-0xfe83ffff pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-3] registered
Jan 23 08:21:34 localhost kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:04:00.0: BAR 1 [mem 0xfe600000-0xfe600fff]
Jan 23 08:21:34 localhost kernel: pci 0000:04:00.0: BAR 4 [mem 0xfbc00000-0xfbc03fff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-4] registered
Jan 23 08:21:34 localhost kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:05:00.0: BAR 4 [mem 0xfba00000-0xfba03fff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-5] registered
Jan 23 08:21:34 localhost kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 08:21:34 localhost kernel: pci 0000:06:00.0: BAR 4 [mem 0xfb800000-0xfb803fff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-6] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-7] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-8] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-9] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-10] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-11] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-12] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-13] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-14] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-15] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-16] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 23 08:21:34 localhost kernel: acpiphp: Slot [0-17] registered
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 23 08:21:34 localhost kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 23 08:21:34 localhost kernel: iommu: Default domain type: Translated
Jan 23 08:21:34 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 08:21:34 localhost kernel: SCSI subsystem initialized
Jan 23 08:21:34 localhost kernel: ACPI: bus type USB registered
Jan 23 08:21:34 localhost kernel: usbcore: registered new interface driver usbfs
Jan 23 08:21:34 localhost kernel: usbcore: registered new interface driver hub
Jan 23 08:21:34 localhost kernel: usbcore: registered new device driver usb
Jan 23 08:21:34 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 08:21:34 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 08:21:34 localhost kernel: PTP clock support registered
Jan 23 08:21:34 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 23 08:21:34 localhost kernel: NetLabel: Initializing
Jan 23 08:21:34 localhost kernel: NetLabel:  domain hash size = 128
Jan 23 08:21:34 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 08:21:34 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 08:21:34 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 23 08:21:34 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 23 08:21:34 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 23 08:21:34 localhost kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 23 08:21:34 localhost kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 08:21:34 localhost kernel: vgaarb: loaded
Jan 23 08:21:34 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 08:21:34 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 08:21:34 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 08:21:34 localhost kernel: pnp: PnP ACPI init
Jan 23 08:21:34 localhost kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 23 08:21:34 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 23 08:21:34 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 08:21:34 localhost kernel: NET: Registered PF_INET protocol family
Jan 23 08:21:34 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 08:21:34 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 08:21:34 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 08:21:34 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 08:21:34 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 08:21:34 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 08:21:34 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 08:21:34 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 08:21:34 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 08:21:34 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 08:21:34 localhost kernel: NET: Registered PF_XDP protocol family
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x0fff] to [bus 03] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: bridge window [io  0x1000-0x0fff] to [bus 04] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: bridge window [io  0x1000-0x0fff] to [bus 05] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: bridge window [io  0x1000-0x0fff] to [bus 06] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: bridge window [io  0x1000-0x0fff] to [bus 07] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: bridge window [io  0x1000-0x0fff] to [bus 08] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: bridge window [io  0x1000-0x0fff] to [bus 09] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: bridge window [io  0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: bridge window [io  0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: bridge window [io  0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: bridge window [io  0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: bridge window [io  0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: bridge window [io  0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: bridge window [io  0x1000-0x0fff] to [bus 10] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: bridge window [io  0x1000-0x0fff] to [bus 11] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x0fff] to [bus 12] add_size 1000
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x1fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: bridge window [io  0x2000-0x2fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: bridge window [io  0x3000-0x3fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: bridge window [io  0x4000-0x4fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: bridge window [io  0x5000-0x5fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: bridge window [io  0x6000-0x6fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: bridge window [io  0x7000-0x7fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: bridge window [io  0x8000-0x8fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: bridge window [io  0x9000-0x9fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: bridge window [io  0xa000-0xafff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: bridge window [io  0xb000-0xbfff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: bridge window [io  0xe000-0xefff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: bridge window [io  0xf000-0xffff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x1fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: bridge window [io  0x2000-0x2fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: bridge window [io  0x3000-0x3fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: bridge window [io  0x4000-0x4fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: bridge window [io  0x5000-0x5fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: bridge window [io  0x6000-0x6fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: bridge window [io  0x7000-0x7fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: bridge window [io  0x8000-0x8fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: bridge window [io  0x9000-0x9fff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: bridge window [io  0xa000-0xafff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: bridge window [io  0xb000-0xbfff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: bridge window [io  0xe000-0xefff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: bridge window [io  0xf000-0xffff]: assigned
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: can't assign; no space
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: failed to assign
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4:   bridge window [io  0xf000-0xffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5:   bridge window [io  0xe000-0xefff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6:   bridge window [io  0xb000-0xbfff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7:   bridge window [io  0xa000-0xafff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0:   bridge window [io  0x9000-0x9fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1:   bridge window [io  0x8000-0x8fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2:   bridge window [io  0x7000-0x7fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3:   bridge window [io  0x6000-0x6fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4:   bridge window [io  0x5000-0x5fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5:   bridge window [io  0x4000-0x4fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6:   bridge window [io  0x3000-0x3fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7:   bridge window [io  0x2000-0x2fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0:   bridge window [io  0x1000-0x1fff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 23 08:21:34 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:00: resource 9 [mem 0x280000000-0xa7fffffff window]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:01: resource 0 [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:01: resource 1 [mem 0xfc600000-0xfc9fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:01: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:02: resource 0 [io  0xc000-0xcfff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:02: resource 1 [mem 0xfc600000-0xfc7fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:02: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:03: resource 2 [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:04: resource 2 [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:05: resource 2 [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:06: resource 0 [io  0xf000-0xffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:06: resource 2 [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:07: resource 0 [io  0xe000-0xefff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:07: resource 2 [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:08: resource 0 [io  0xb000-0xbfff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:08: resource 2 [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:09: resource 0 [io  0xa000-0xafff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:09: resource 2 [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0a: resource 0 [io  0x9000-0x9fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0a: resource 1 [mem 0xfda00000-0xfdbfffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0a: resource 2 [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0b: resource 0 [io  0x8000-0x8fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0b: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0b: resource 2 [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0c: resource 0 [io  0x7000-0x7fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0c: resource 1 [mem 0xfd600000-0xfd7fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0c: resource 2 [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0d: resource 0 [io  0x6000-0x6fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0d: resource 1 [mem 0xfd400000-0xfd5fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0d: resource 2 [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0e: resource 0 [io  0x5000-0x5fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0e: resource 1 [mem 0xfd200000-0xfd3fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0e: resource 2 [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0f: resource 0 [io  0x4000-0x4fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0f: resource 1 [mem 0xfd000000-0xfd1fffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:0f: resource 2 [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:10: resource 0 [io  0x3000-0x3fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:10: resource 1 [mem 0xfce00000-0xfcffffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:10: resource 2 [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:11: resource 0 [io  0x2000-0x2fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:11: resource 1 [mem 0xfcc00000-0xfcdfffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:11: resource 2 [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:12: resource 0 [io  0x1000-0x1fff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:12: resource 1 [mem 0xfca00000-0xfcbfffff]
Jan 23 08:21:34 localhost kernel: pci_bus 0000:12: resource 2 [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 23 08:21:34 localhost kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 23 08:21:34 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 23 08:21:34 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 08:21:34 localhost kernel: software IO TLB: mapped [mem 0x000000006b000000-0x000000006f000000] (64MB)
Jan 23 08:21:34 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 23 08:21:34 localhost kernel: ACPI: bus type thunderbolt registered
Jan 23 08:21:34 localhost kernel: Initialise system trusted keyrings
Jan 23 08:21:34 localhost kernel: Key type blacklist registered
Jan 23 08:21:34 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 08:21:34 localhost kernel: zbud: loaded
Jan 23 08:21:34 localhost kernel: integrity: Platform Keyring initialized
Jan 23 08:21:34 localhost kernel: integrity: Machine keyring initialized
Jan 23 08:21:34 localhost kernel: Freeing initrd memory: 87956K
Jan 23 08:21:34 localhost kernel: NET: Registered PF_ALG protocol family
Jan 23 08:21:34 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 23 08:21:34 localhost kernel: Key type asymmetric registered
Jan 23 08:21:34 localhost kernel: Asymmetric key parser 'x509' registered
Jan 23 08:21:34 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 08:21:34 localhost kernel: io scheduler mq-deadline registered
Jan 23 08:21:34 localhost kernel: io scheduler kyber registered
Jan 23 08:21:34 localhost kernel: io scheduler bfq registered
Jan 23 08:21:34 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 23 08:21:34 localhost kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39
Jan 23 08:21:34 localhost kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40
Jan 23 08:21:34 localhost kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40
Jan 23 08:21:34 localhost kernel: shpchp 0000:01:00.0: HPC vendor_id 1b36 device_id e ss_vid 0 ss_did 0
Jan 23 08:21:34 localhost kernel: shpchp 0000:01:00.0: pci_hp_register failed with error -16
Jan 23 08:21:34 localhost kernel: shpchp 0000:01:00.0: Slot initialization failed
Jan 23 08:21:34 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 08:21:34 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 08:21:34 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 23 08:21:34 localhost kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Jan 23 08:21:34 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 08:21:34 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 08:21:34 localhost kernel: Non-volatile memory driver v1.3
Jan 23 08:21:34 localhost kernel: rdac: device handler registered
Jan 23 08:21:34 localhost kernel: hp_sw: device handler registered
Jan 23 08:21:34 localhost kernel: emc: device handler registered
Jan 23 08:21:34 localhost kernel: alua: device handler registered
Jan 23 08:21:34 localhost kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Jan 23 08:21:34 localhost kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Jan 23 08:21:34 localhost kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Jan 23 08:21:34 localhost kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x0000c000
Jan 23 08:21:34 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 08:21:34 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 08:21:34 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 23 08:21:34 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 08:21:34 localhost kernel: usb usb1: SerialNumber: 0000:02:01.0
Jan 23 08:21:34 localhost kernel: hub 1-0:1.0: USB hub found
Jan 23 08:21:34 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 23 08:21:34 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 08:21:34 localhost kernel: usbserial: USB Serial support registered for generic
Jan 23 08:21:34 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 08:21:34 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 08:21:34 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 08:21:34 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 08:21:34 localhost kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 23 08:21:34 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 08:21:34 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 08:21:34 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 08:21:34 localhost kernel: rtc_cmos 00:03: registered as rtc0
Jan 23 08:21:34 localhost kernel: rtc_cmos 00:03: setting system clock to 2026-01-23T08:21:34 UTC (1769156494)
Jan 23 08:21:34 localhost kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 23 08:21:34 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 08:21:34 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 08:21:34 localhost kernel: usbcore: registered new interface driver usbhid
Jan 23 08:21:34 localhost kernel: usbhid: USB HID core driver
Jan 23 08:21:34 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 23 08:21:34 localhost kernel: Initializing XFRM netlink socket
Jan 23 08:21:34 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 23 08:21:34 localhost kernel: Segment Routing with IPv6
Jan 23 08:21:34 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 23 08:21:34 localhost kernel: mpls_gso: MPLS GSO support
Jan 23 08:21:34 localhost kernel: IPI shorthand broadcast: enabled
Jan 23 08:21:34 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 08:21:34 localhost kernel: AES CTR mode by8 optimization enabled
Jan 23 08:21:34 localhost kernel: sched_clock: Marking stable (1175002120, 145732374)->(1389498278, -68763784)
Jan 23 08:21:34 localhost kernel: registered taskstats version 1
Jan 23 08:21:34 localhost kernel: Loading compiled-in X.509 certificates
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 08:21:34 localhost kernel: Demotion targets for Node 0: null
Jan 23 08:21:34 localhost kernel: page_owner is disabled
Jan 23 08:21:34 localhost kernel: Key type .fscrypt registered
Jan 23 08:21:34 localhost kernel: Key type fscrypt-provisioning registered
Jan 23 08:21:34 localhost kernel: Key type big_key registered
Jan 23 08:21:34 localhost kernel: Key type encrypted registered
Jan 23 08:21:34 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 08:21:34 localhost kernel: Loading compiled-in module X.509 certificates
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 08:21:34 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 23 08:21:34 localhost kernel: ima: No architecture policies found
Jan 23 08:21:34 localhost kernel: evm: Initialising EVM extended attributes:
Jan 23 08:21:34 localhost kernel: evm: security.selinux
Jan 23 08:21:34 localhost kernel: evm: security.SMACK64 (disabled)
Jan 23 08:21:34 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 08:21:34 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 08:21:34 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 08:21:34 localhost kernel: evm: security.apparmor (disabled)
Jan 23 08:21:34 localhost kernel: evm: security.ima
Jan 23 08:21:34 localhost kernel: evm: security.capability
Jan 23 08:21:34 localhost kernel: evm: HMAC attrs: 0x1
Jan 23 08:21:34 localhost kernel: Running certificate verification RSA selftest
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 08:21:34 localhost kernel: Running certificate verification ECDSA selftest
Jan 23 08:21:34 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 08:21:34 localhost kernel: clk: Disabling unused clocks
Jan 23 08:21:34 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 23 08:21:34 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 08:21:34 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 23 08:21:34 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 08:21:34 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 08:21:34 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 08:21:34 localhost kernel: Run /init as init process
Jan 23 08:21:34 localhost kernel:   with arguments:
Jan 23 08:21:34 localhost kernel:     /init
Jan 23 08:21:34 localhost kernel:   with environment:
Jan 23 08:21:34 localhost kernel:     HOME=/
Jan 23 08:21:34 localhost kernel:     TERM=linux
Jan 23 08:21:34 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 23 08:21:34 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 08:21:34 localhost systemd[1]: Detected virtualization kvm.
Jan 23 08:21:34 localhost systemd[1]: Detected architecture x86-64.
Jan 23 08:21:34 localhost systemd[1]: Running in initrd.
Jan 23 08:21:34 localhost systemd[1]: No hostname configured, using default hostname.
Jan 23 08:21:34 localhost systemd[1]: Hostname set to <localhost>.
Jan 23 08:21:34 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 23 08:21:34 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 23 08:21:34 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 08:21:34 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 08:21:34 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 23 08:21:34 localhost systemd[1]: Reached target Local File Systems.
Jan 23 08:21:34 localhost systemd[1]: Reached target Path Units.
Jan 23 08:21:34 localhost systemd[1]: Reached target Slice Units.
Jan 23 08:21:34 localhost systemd[1]: Reached target Swaps.
Jan 23 08:21:34 localhost systemd[1]: Reached target Timer Units.
Jan 23 08:21:34 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 08:21:34 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 23 08:21:34 localhost systemd[1]: Listening on Journal Socket.
Jan 23 08:21:34 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 08:21:34 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 08:21:34 localhost systemd[1]: Reached target Socket Units.
Jan 23 08:21:34 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 08:21:34 localhost systemd[1]: Starting Journal Service...
Jan 23 08:21:34 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 08:21:34 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 08:21:34 localhost systemd[1]: Starting Create System Users...
Jan 23 08:21:34 localhost systemd[1]: Starting Setup Virtual Console...
Jan 23 08:21:34 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 08:21:34 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 08:21:34 localhost systemd[1]: Finished Create System Users.
Jan 23 08:21:34 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 08:21:34 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 08:21:34 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 08:21:34 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 23 08:21:34 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:02.0:00.0:01.0-1
Jan 23 08:21:34 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 08:21:34 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Jan 23 08:21:34 localhost systemd-journald[283]: Journal started
Jan 23 08:21:34 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/5a008b0a4e784797b2625f7749cb75af) is 8.0M, max 153.6M, 145.6M free.
Jan 23 08:21:34 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Jan 23 08:21:34 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Jan 23 08:21:34 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 08:21:34 localhost systemd[1]: Started Journal Service.
Jan 23 08:21:34 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 08:21:34 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 08:21:34 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 08:21:35 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 08:21:35 localhost systemd[1]: Finished Setup Virtual Console.
Jan 23 08:21:35 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 08:21:35 localhost systemd[1]: Starting dracut cmdline hook...
Jan 23 08:21:35 localhost dracut-cmdline[300]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 08:21:35 localhost dracut-cmdline[300]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 08:21:35 localhost systemd[1]: Finished dracut cmdline hook.
Jan 23 08:21:35 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 23 08:21:35 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 08:21:35 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 23 08:21:35 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 08:21:35 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 23 08:21:35 localhost kernel: RPC: Registered udp transport module.
Jan 23 08:21:35 localhost kernel: RPC: Registered tcp transport module.
Jan 23 08:21:35 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 08:21:35 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 08:21:35 localhost rpc.statd[415]: Version 2.5.4 starting
Jan 23 08:21:35 localhost rpc.statd[415]: Initializing NSM state
Jan 23 08:21:35 localhost rpc.idmapd[420]: Setting log level to 0
Jan 23 08:21:35 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 23 08:21:35 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 08:21:35 localhost systemd-udevd[434]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 08:21:35 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 08:21:35 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 23 08:21:35 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 23 08:21:35 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 08:21:35 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 23 08:21:35 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 08:21:35 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 08:21:35 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 08:21:35 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 08:21:35 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 08:21:35 localhost systemd[1]: Reached target Network.
Jan 23 08:21:35 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 08:21:35 localhost systemd[1]: Starting dracut initqueue hook...
Jan 23 08:21:35 localhost kernel: virtio_blk virtio2: 4/0/0 default/read/poll queues
Jan 23 08:21:35 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 08:21:35 localhost kernel:  vda: vda1
Jan 23 08:21:35 localhost kernel: libata version 3.00 loaded.
Jan 23 08:21:35 localhost systemd-udevd[460]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 08:21:35 localhost kernel: ahci 0000:00:1f.2: version 3.0
Jan 23 08:21:35 localhost kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 23 08:21:35 localhost kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Jan 23 08:21:35 localhost kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Jan 23 08:21:35 localhost kernel: ahci 0000:00:1f.2: flags: 64bit ncq only 
Jan 23 08:21:35 localhost kernel: scsi host0: ahci
Jan 23 08:21:35 localhost kernel: scsi host1: ahci
Jan 23 08:21:35 localhost kernel: scsi host2: ahci
Jan 23 08:21:35 localhost kernel: scsi host3: ahci
Jan 23 08:21:35 localhost kernel: scsi host4: ahci
Jan 23 08:21:35 localhost kernel: scsi host5: ahci
Jan 23 08:21:35 localhost kernel: ata1: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22100 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost kernel: ata2: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22180 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost kernel: ata3: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22200 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost kernel: ata4: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22280 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost kernel: ata5: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22300 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost kernel: ata6: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22380 irq 49 lpm-pol 0
Jan 23 08:21:35 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 08:21:35 localhost systemd[1]: Reached target Initrd Root Device.
Jan 23 08:21:35 localhost kernel: ata2: SATA link down (SStatus 0 SControl 300)
Jan 23 08:21:35 localhost kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Jan 23 08:21:35 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 08:21:35 localhost kernel: ata1.00: applying bridge limits
Jan 23 08:21:35 localhost kernel: ata1.00: configured for UDMA/100
Jan 23 08:21:35 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 08:21:35 localhost kernel: ata6: SATA link down (SStatus 0 SControl 300)
Jan 23 08:21:35 localhost kernel: ata3: SATA link down (SStatus 0 SControl 300)
Jan 23 08:21:35 localhost kernel: ata5: SATA link down (SStatus 0 SControl 300)
Jan 23 08:21:35 localhost kernel: ata4: SATA link down (SStatus 0 SControl 300)
Jan 23 08:21:35 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 23 08:21:35 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 23 08:21:35 localhost systemd[1]: Reached target System Initialization.
Jan 23 08:21:35 localhost systemd[1]: Reached target Basic System.
Jan 23 08:21:35 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 08:21:35 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 08:21:35 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 08:21:35 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 23 08:21:36 localhost systemd[1]: Finished dracut initqueue hook.
Jan 23 08:21:36 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 08:21:36 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 08:21:36 localhost systemd[1]: Reached target Remote File Systems.
Jan 23 08:21:36 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 23 08:21:36 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 23 08:21:36 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 08:21:36 localhost systemd-fsck[526]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 08:21:36 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 08:21:36 localhost systemd[1]: Mounting /sysroot...
Jan 23 08:21:36 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 08:21:36 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 08:21:36 localhost kernel: XFS (vda1): Ending clean mount
Jan 23 08:21:36 localhost systemd[1]: Mounted /sysroot.
Jan 23 08:21:36 localhost systemd[1]: Reached target Initrd Root File System.
Jan 23 08:21:36 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 08:21:36 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 08:21:36 localhost systemd[1]: Reached target Initrd File Systems.
Jan 23 08:21:36 localhost systemd[1]: Reached target Initrd Default Target.
Jan 23 08:21:36 localhost systemd[1]: Starting dracut mount hook...
Jan 23 08:21:36 localhost systemd[1]: Finished dracut mount hook.
Jan 23 08:21:36 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 08:21:36 localhost rpc.idmapd[420]: exiting on signal 15
Jan 23 08:21:36 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 08:21:36 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 08:21:36 localhost systemd[1]: Stopped target Network.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Timer Units.
Jan 23 08:21:36 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 08:21:36 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Basic System.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Path Units.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Remote File Systems.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Slice Units.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Socket Units.
Jan 23 08:21:36 localhost systemd[1]: Stopped target System Initialization.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Local File Systems.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Swaps.
Jan 23 08:21:36 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut mount hook.
Jan 23 08:21:36 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 23 08:21:36 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 08:21:36 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 08:21:36 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 23 08:21:36 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 23 08:21:36 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 08:21:36 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 08:21:36 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 08:21:36 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 08:21:36 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 23 08:21:36 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 08:21:36 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 08:21:36 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Closed udev Control Socket.
Jan 23 08:21:36 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Closed udev Kernel Socket.
Jan 23 08:21:36 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 23 08:21:36 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 23 08:21:36 localhost systemd[1]: Starting Cleanup udev Database...
Jan 23 08:21:36 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 08:21:36 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 08:21:36 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Stopped Create System Users.
Jan 23 08:21:36 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 08:21:36 localhost systemd[1]: Finished Cleanup udev Database.
Jan 23 08:21:36 localhost systemd[1]: Reached target Switch Root.
Jan 23 08:21:36 localhost systemd[1]: Starting Switch Root...
Jan 23 08:21:36 localhost systemd[1]: Switching root.
Jan 23 08:21:36 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Jan 23 08:21:36 localhost systemd-journald[283]: Journal stopped
Jan 23 08:21:37 localhost kernel: audit: type=1404 audit(1769156496.925:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability open_perms=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:21:37 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:21:37 localhost kernel: audit: type=1403 audit(1769156497.031:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 08:21:37 localhost systemd[1]: Successfully loaded SELinux policy in 108.645ms.
Jan 23 08:21:37 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.264ms.
Jan 23 08:21:37 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 08:21:37 localhost systemd[1]: Detected virtualization kvm.
Jan 23 08:21:37 localhost systemd[1]: Detected architecture x86-64.
Jan 23 08:21:37 localhost systemd-rc-local-generator[608]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:21:37 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Stopped Switch Root.
Jan 23 08:21:37 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 08:21:37 localhost systemd[1]: Created slice Slice /system/getty.
Jan 23 08:21:37 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 23 08:21:37 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 23 08:21:37 localhost systemd[1]: Created slice User and Session Slice.
Jan 23 08:21:37 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 08:21:37 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 23 08:21:37 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 08:21:37 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 08:21:37 localhost systemd[1]: Stopped target Switch Root.
Jan 23 08:21:37 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 23 08:21:37 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 23 08:21:37 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 23 08:21:37 localhost systemd[1]: Reached target Path Units.
Jan 23 08:21:37 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 23 08:21:37 localhost systemd[1]: Reached target Slice Units.
Jan 23 08:21:37 localhost systemd[1]: Reached target Swaps.
Jan 23 08:21:37 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 23 08:21:37 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 23 08:21:37 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 23 08:21:37 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 23 08:21:37 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 23 08:21:37 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 08:21:37 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 08:21:37 localhost systemd[1]: Mounting Huge Pages File System...
Jan 23 08:21:37 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 23 08:21:37 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 23 08:21:37 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 23 08:21:37 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 08:21:37 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 08:21:37 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 08:21:37 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 23 08:21:37 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 23 08:21:37 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 23 08:21:37 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 08:21:37 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 23 08:21:37 localhost systemd[1]: Stopped Journal Service.
Jan 23 08:21:37 localhost systemd[1]: Starting Journal Service...
Jan 23 08:21:37 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 08:21:37 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 23 08:21:37 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 08:21:37 localhost kernel: fuse: init (API version 7.37)
Jan 23 08:21:37 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 23 08:21:37 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 08:21:37 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 08:21:37 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 08:21:37 localhost systemd[1]: Mounted Huge Pages File System.
Jan 23 08:21:37 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 23 08:21:37 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 23 08:21:37 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 23 08:21:37 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 08:21:37 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 08:21:37 localhost systemd-journald[650]: Journal started
Jan 23 08:21:37 localhost systemd-journald[650]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 08:21:37 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 23 08:21:37 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Started Journal Service.
Jan 23 08:21:37 localhost kernel: ACPI: bus type drm_connector registered
Jan 23 08:21:37 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 08:21:37 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 23 08:21:37 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 08:21:37 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 23 08:21:37 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 08:21:37 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 08:21:37 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 08:21:37 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 08:21:37 localhost systemd[1]: Mounting FUSE Control File System...
Jan 23 08:21:37 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 08:21:37 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 23 08:21:37 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 08:21:37 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 08:21:37 localhost systemd-journald[650]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 08:21:37 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 08:21:37 localhost systemd-journald[650]: Received client request to flush runtime journal.
Jan 23 08:21:37 localhost systemd[1]: Starting Create System Users...
Jan 23 08:21:37 localhost systemd[1]: Mounted FUSE Control File System.
Jan 23 08:21:37 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 08:21:37 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 08:21:37 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 08:21:37 localhost systemd[1]: Finished Create System Users.
Jan 23 08:21:37 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 08:21:37 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 08:21:37 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 08:21:37 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 08:21:37 localhost systemd[1]: Reached target Local File Systems.
Jan 23 08:21:37 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 08:21:37 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 08:21:37 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 08:21:37 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 08:21:37 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 08:21:37 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 08:21:37 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 08:21:37 localhost bootctl[667]: Couldn't find EFI system partition, skipping.
Jan 23 08:21:37 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 08:21:37 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 08:21:37 localhost systemd[1]: Starting Security Auditing Service...
Jan 23 08:21:37 localhost systemd[1]: Starting RPC Bind...
Jan 23 08:21:37 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 08:21:37 localhost auditd[673]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 08:21:37 localhost auditd[673]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 08:21:37 localhost systemd[1]: Started RPC Bind.
Jan 23 08:21:37 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 08:21:37 localhost augenrules[678]: /sbin/augenrules: No change
Jan 23 08:21:37 localhost augenrules[693]: No rules
Jan 23 08:21:37 localhost augenrules[693]: enabled 1
Jan 23 08:21:37 localhost augenrules[693]: failure 1
Jan 23 08:21:37 localhost augenrules[693]: pid 673
Jan 23 08:21:37 localhost augenrules[693]: rate_limit 0
Jan 23 08:21:37 localhost augenrules[693]: backlog_limit 8192
Jan 23 08:21:37 localhost augenrules[693]: lost 0
Jan 23 08:21:37 localhost augenrules[693]: backlog 0
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time 60000
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 23 08:21:37 localhost augenrules[693]: enabled 1
Jan 23 08:21:37 localhost augenrules[693]: failure 1
Jan 23 08:21:37 localhost augenrules[693]: pid 673
Jan 23 08:21:37 localhost augenrules[693]: rate_limit 0
Jan 23 08:21:37 localhost augenrules[693]: backlog_limit 8192
Jan 23 08:21:37 localhost augenrules[693]: lost 0
Jan 23 08:21:37 localhost augenrules[693]: backlog 4
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time 60000
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 23 08:21:37 localhost augenrules[693]: enabled 1
Jan 23 08:21:37 localhost augenrules[693]: failure 1
Jan 23 08:21:37 localhost augenrules[693]: pid 673
Jan 23 08:21:37 localhost augenrules[693]: rate_limit 0
Jan 23 08:21:37 localhost augenrules[693]: backlog_limit 8192
Jan 23 08:21:37 localhost augenrules[693]: lost 0
Jan 23 08:21:37 localhost augenrules[693]: backlog 4
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time 60000
Jan 23 08:21:37 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 23 08:21:37 localhost systemd[1]: Started Security Auditing Service.
Jan 23 08:21:37 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 08:21:37 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 08:21:37 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 08:21:37 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 23 08:21:37 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 08:21:37 localhost systemd[1]: Starting Update is Completed...
Jan 23 08:21:37 localhost systemd[1]: Finished Update is Completed.
Jan 23 08:21:37 localhost systemd-udevd[701]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 08:21:37 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 08:21:37 localhost systemd[1]: Reached target System Initialization.
Jan 23 08:21:37 localhost systemd[1]: Started dnf makecache --timer.
Jan 23 08:21:37 localhost systemd[1]: Started Daily rotation of log files.
Jan 23 08:21:37 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 08:21:37 localhost systemd[1]: Reached target Timer Units.
Jan 23 08:21:37 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 08:21:37 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 08:21:37 localhost systemd[1]: Reached target Socket Units.
Jan 23 08:21:37 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 23 08:21:37 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 08:21:37 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 08:21:37 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 08:21:37 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 08:21:37 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 08:21:37 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 23 08:21:37 localhost systemd[1]: Reached target Basic System.
Jan 23 08:21:37 localhost systemd-udevd[716]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 08:21:37 localhost dbus-broker-lau[725]: Ready
Jan 23 08:21:37 localhost systemd[1]: Starting NTP client/server...
Jan 23 08:21:37 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 08:21:37 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 08:21:37 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 08:21:37 localhost systemd[1]: Started irqbalance daemon.
Jan 23 08:21:37 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 08:21:37 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 08:21:37 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 08:21:37 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 08:21:37 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 23 08:21:37 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 08:21:37 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 23 08:21:37 localhost systemd[1]: Starting User Login Management...
Jan 23 08:21:38 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 08:21:38 localhost chronyd[754]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 08:21:38 localhost chronyd[754]: Loaded 0 symmetric keys
Jan 23 08:21:38 localhost chronyd[754]: Using right/UTC timezone to obtain leap second data
Jan 23 08:21:38 localhost chronyd[754]: Loaded seccomp filter (level 2)
Jan 23 08:21:38 localhost systemd[1]: Started NTP client/server.
Jan 23 08:21:38 localhost systemd-logind[746]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 08:21:38 localhost systemd-logind[746]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 08:21:38 localhost systemd-logind[746]: New seat seat0.
Jan 23 08:21:38 localhost systemd[1]: Started User Login Management.
Jan 23 08:21:38 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 08:21:38 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 08:21:38 localhost kernel: lpc_ich 0000:00:1f.0: I/O space for GPIO uninitialized
Jan 23 08:21:38 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Jan 23 08:21:38 localhost kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Jan 23 08:21:38 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 08:21:38 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 08:21:38 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 08:21:38 localhost kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Jan 23 08:21:38 localhost iptables.init[739]: iptables: Applying firewall rules: [  OK  ]
Jan 23 08:21:38 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 08:21:38 localhost kernel: Console: switching to colour dummy device 80x25
Jan 23 08:21:38 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 08:21:38 localhost kernel: [drm] features: -context_init
Jan 23 08:21:38 localhost kernel: [drm] number of scanouts: 1
Jan 23 08:21:38 localhost kernel: [drm] number of cap sets: 0
Jan 23 08:21:38 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Jan 23 08:21:38 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 08:21:38 localhost kernel: Console: switching to colour frame buffer device 160x50
Jan 23 08:21:38 localhost kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 08:21:38 localhost kernel: iTCO_vendor_support: vendor-support=0
Jan 23 08:21:38 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: Found a ICH9 TCO device (Version=2, TCOBASE=0x0660)
Jan 23 08:21:38 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: initialized. heartbeat=30 sec (nowayout=0)
Jan 23 08:21:38 localhost kernel: kvm_amd: TSC scaling supported
Jan 23 08:21:38 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 23 08:21:38 localhost kernel: kvm_amd: Nested Paging enabled
Jan 23 08:21:38 localhost kernel: kvm_amd: LBR virtualization supported
Jan 23 08:21:38 localhost kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Jan 23 08:21:38 localhost kernel: kvm_amd: Virtual GIF supported
Jan 23 08:21:38 localhost cloud-init[794]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 08:21:38 +0000. Up 5.11 seconds.
Jan 23 08:21:38 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 23 08:21:38 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 23 08:21:38 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp3b36xcpf.mount: Deactivated successfully.
Jan 23 08:21:38 localhost systemd[1]: Starting Hostname Service...
Jan 23 08:21:38 localhost systemd[1]: Started Hostname Service.
Jan 23 08:21:38 np0005593250 systemd-hostnamed[808]: Hostname set to <np0005593250> (static)
Jan 23 08:21:38 np0005593250 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 08:21:38 np0005593250 systemd[1]: Reached target Preparation for Network.
Jan 23 08:21:38 np0005593250 systemd[1]: Starting Network Manager...
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8833] NetworkManager (version 1.54.3-2.el9) is starting... (boot:1b526f85-5c85-4390-8f4f-c5e9ef0f805d)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8837] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8922] manager[0x562411a3d000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8956] hostname: hostname: using hostnamed
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8956] hostname: static hostname changed from (none) to "np0005593250"
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.8959] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9036] manager[0x562411a3d000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9037] manager[0x562411a3d000]: rfkill: WWAN hardware radio set enabled
Jan 23 08:21:38 np0005593250 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9080] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9081] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9082] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9082] manager: Networking is enabled by state file
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9086] settings: Loaded settings plugin: keyfile (internal)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9109] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9129] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9142] dhcp: init: Using DHCP client 'internal'
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9145] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9158] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9168] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9178] device (lo): Activation: starting connection 'lo' (ab218173-ee72-4eee-b4f5-cff3eabad9ac)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9187] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9191] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9214] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9220] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9222] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9227] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9230] device (eth0): carrier: link connected
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9232] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9235] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 08:21:38 np0005593250 systemd[1]: Started Network Manager.
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9245] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9248] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9249] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9253] manager: NetworkManager state is now CONNECTING
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9254] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:21:38 np0005593250 systemd[1]: Reached target Network.
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9263] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9270] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9275] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 23 08:21:38 np0005593250 systemd[1]: Starting Network Manager Wait Online...
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9298] dhcp4 (eth0): state changed new lease, address=192.168.26.29
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9304] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 08:21:38 np0005593250 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 08:21:38 np0005593250 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9427] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9430] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 08:21:38 np0005593250 NetworkManager[812]: <info>  [1769156498.9441] device (lo): Activation: successful, device activated.
Jan 23 08:21:38 np0005593250 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 08:21:38 np0005593250 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 08:21:38 np0005593250 systemd[1]: Reached target NFS client services.
Jan 23 08:21:38 np0005593250 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 08:21:38 np0005593250 systemd[1]: Reached target Remote File Systems.
Jan 23 08:21:38 np0005593250 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 08:21:40 np0005593250 NetworkManager[812]: <info>  [1769156500.5295] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:21:41 np0005593250 NetworkManager[812]: <info>  [1769156501.5572] dhcp6 (eth0): state changed new lease, address=2001:db8::397
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7703] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7740] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7741] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7744] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7747] device (eth0): Activation: successful, device activated.
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7751] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 08:21:42 np0005593250 NetworkManager[812]: <info>  [1769156502.7753] manager: startup complete
Jan 23 08:21:42 np0005593250 systemd[1]: Finished Network Manager Wait Online.
Jan 23 08:21:42 np0005593250 systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 08:21:43 np0005593250 cloud-init[878]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 08:21:43 +0000. Up 9.69 seconds.
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |  eth0  | True |        192.168.26.29         | 255.255.255.0 | global | fa:16:3e:a8:bb:c7 |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |  eth0  | True |      2001:db8::397/128       |       .       | global | fa:16:3e:a8:bb:c7 |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |  eth0  | True | fe80::f816:3eff:fea8:bbc7/64 |       .       |  link  | fa:16:3e:a8:bb:c7 |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: | Route |   Destination   |   Gateway    |     Genmask     | Interface | Flags |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   0   |     0.0.0.0     | 192.168.26.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   1   | 169.254.169.254 | 192.168.26.2 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   2   |   192.168.26.0  |   0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: ++++++++++++++++++++++Route IPv6 info++++++++++++++++++++++
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: | Route |  Destination  |   Gateway   | Interface | Flags |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   1   |  2001:db8::1  |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   2   | 2001:db8::397 |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   3   |   fe80::/64   |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   4   |      ::/0     | 2001:db8::1 |    eth0   |   UG  |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   6   |     local     |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   7   |     local     |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: |   8   |   multicast   |      ::     |    eth0   |   U   |
Jan 23 08:21:43 np0005593250 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 23 08:21:43 np0005593250 useradd[945]: new group: name=cloud-user, GID=1001
Jan 23 08:21:43 np0005593250 useradd[945]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 23 08:21:43 np0005593250 useradd[945]: add 'cloud-user' to group 'adm'
Jan 23 08:21:43 np0005593250 useradd[945]: add 'cloud-user' to group 'systemd-journal'
Jan 23 08:21:43 np0005593250 useradd[945]: add 'cloud-user' to shadow group 'adm'
Jan 23 08:21:43 np0005593250 useradd[945]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 23 08:21:43 np0005593250 chronyd[754]: Selected source 172.238.164.57 (2.centos.pool.ntp.org)
Jan 23 08:21:43 np0005593250 chronyd[754]: System clock wrong by 1.120971 seconds
Jan 23 08:21:45 np0005593250 chronyd[754]: System clock was stepped by 1.120971 seconds
Jan 23 08:21:45 np0005593250 chronyd[754]: System clock TAI offset set to 37 seconds
Jan 23 08:21:45 np0005593250 cloud-init[878]: Generating public/private rsa key pair.
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key fingerprint is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: SHA256:8uJefgxglv5LT00fMI0VkFNXV8ruDjNqf0pvkZ721og root@np0005593250
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key's randomart image is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: +---[RSA 3072]----+
Jan 23 08:21:45 np0005593250 cloud-init[878]: |            .++.B|
Jan 23 08:21:45 np0005593250 cloud-init[878]: |            o= o.|
Jan 23 08:21:45 np0005593250 cloud-init[878]: |       .    +.+  |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |      =      +   |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |     +..S   . o. |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |      .o.  o oo. |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |      ..+o. *oo+.|
Jan 23 08:21:45 np0005593250 cloud-init[878]: |     . =.o+oE*B o|
Jan 23 08:21:45 np0005593250 cloud-init[878]: |     .o o+o.o=oo.|
Jan 23 08:21:45 np0005593250 cloud-init[878]: +----[SHA256]-----+
Jan 23 08:21:45 np0005593250 cloud-init[878]: Generating public/private ecdsa key pair.
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key fingerprint is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: SHA256:Y9S0bBrpeEA1W6+HXOJ52JYpgU+YlJJq7RK4hgyCY2g root@np0005593250
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key's randomart image is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: +---[ECDSA 256]---+
Jan 23 08:21:45 np0005593250 cloud-init[878]: |      .o+.o      |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |     .o..% o     |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |o  . o..O O o    |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |*E. + .= O X o   |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |*o o o. S X B    |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |..o . .o . =     |
Jan 23 08:21:45 np0005593250 cloud-init[878]: | .   .           |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |                 |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |                 |
Jan 23 08:21:45 np0005593250 cloud-init[878]: +----[SHA256]-----+
Jan 23 08:21:45 np0005593250 cloud-init[878]: Generating public/private ed25519 key pair.
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 08:21:45 np0005593250 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key fingerprint is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: SHA256:4Mx4pMLTtpDCsxWTVsHSphkYsCGO+DCeA1QY8tQ42Ro root@np0005593250
Jan 23 08:21:45 np0005593250 cloud-init[878]: The key's randomart image is:
Jan 23 08:21:45 np0005593250 cloud-init[878]: +--[ED25519 256]--+
Jan 23 08:21:45 np0005593250 cloud-init[878]: |*+B*oo.          |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |BBEo=+           |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |B..O* o          |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |+=+=oB .         |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |.*B.= = S        |
Jan 23 08:21:45 np0005593250 cloud-init[878]: | .== o           |
Jan 23 08:21:45 np0005593250 cloud-init[878]: | .  .            |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |                 |
Jan 23 08:21:45 np0005593250 cloud-init[878]: |                 |
Jan 23 08:21:45 np0005593250 cloud-init[878]: +----[SHA256]-----+
Jan 23 08:21:45 np0005593250 systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 08:21:45 np0005593250 systemd[1]: Reached target Cloud-config availability.
Jan 23 08:21:45 np0005593250 systemd[1]: Reached target Network is Online.
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Crash recovery kernel arming...
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 08:21:45 np0005593250 systemd[1]: Starting System Logging Service...
Jan 23 08:21:45 np0005593250 sm-notify[961]: Version 2.5.4 starting
Jan 23 08:21:45 np0005593250 systemd[1]: Starting OpenSSH server daemon...
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Permit User Sessions...
Jan 23 08:21:45 np0005593250 systemd[1]: Started Notify NFS peers of a restart.
Jan 23 08:21:45 np0005593250 sshd[963]: Server listening on 0.0.0.0 port 22.
Jan 23 08:21:45 np0005593250 sshd[963]: Server listening on :: port 22.
Jan 23 08:21:45 np0005593250 systemd[1]: Started OpenSSH server daemon.
Jan 23 08:21:45 np0005593250 systemd[1]: Finished Permit User Sessions.
Jan 23 08:21:45 np0005593250 systemd[1]: Started Command Scheduler.
Jan 23 08:21:45 np0005593250 systemd[1]: Started Getty on tty1.
Jan 23 08:21:45 np0005593250 rsyslogd[962]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="962" x-info="https://www.rsyslog.com"] start
Jan 23 08:21:45 np0005593250 rsyslogd[962]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 08:21:45 np0005593250 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 08:21:45 np0005593250 systemd[1]: Reached target Login Prompts.
Jan 23 08:21:45 np0005593250 systemd[1]: Started System Logging Service.
Jan 23 08:21:45 np0005593250 systemd[1]: Reached target Multi-User System.
Jan 23 08:21:45 np0005593250 crond[968]: (CRON) STARTUP (1.5.7)
Jan 23 08:21:45 np0005593250 crond[968]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 23 08:21:45 np0005593250 crond[968]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 3% if used.)
Jan 23 08:21:45 np0005593250 crond[968]: (CRON) INFO (running with inotify support)
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 08:21:45 np0005593250 sshd-session[983]: Unable to negotiate with 192.168.26.11 port 51680: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 23 08:21:45 np0005593250 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 08:21:45 np0005593250 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 08:21:45 np0005593250 sshd-session[993]: Unable to negotiate with 192.168.26.11 port 51704: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 23 08:21:45 np0005593250 sshd-session[1001]: Unable to negotiate with 192.168.26.11 port 51720: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 23 08:21:45 np0005593250 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 08:21:45 np0005593250 sshd-session[1016]: Connection closed by 192.168.26.11 port 51744 [preauth]
Jan 23 08:21:45 np0005593250 sshd-session[1025]: Unable to negotiate with 192.168.26.11 port 51756: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 23 08:21:45 np0005593250 sshd-session[966]: Connection closed by 192.168.26.11 port 51668 [preauth]
Jan 23 08:21:45 np0005593250 sshd-session[1033]: Unable to negotiate with 192.168.26.11 port 51764: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 23 08:21:45 np0005593250 sshd-session[987]: Connection closed by 192.168.26.11 port 51690 [preauth]
Jan 23 08:21:45 np0005593250 kdumpctl[970]: kdump: No kdump initial ramdisk found.
Jan 23 08:21:45 np0005593250 kdumpctl[970]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 08:21:45 np0005593250 sshd-session[1010]: Connection closed by 192.168.26.11 port 51732 [preauth]
Jan 23 08:21:45 np0005593250 cloud-init[1107]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 08:21:45 +0000. Up 10.88 seconds.
Jan 23 08:21:45 np0005593250 systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 08:21:45 np0005593250 systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 08:21:45 np0005593250 dracut[1240]: dracut-057-102.git20250818.el9
Jan 23 08:21:45 np0005593250 cloud-init[1258]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 08:21:45 +0000. Up 11.21 seconds.
Jan 23 08:21:45 np0005593250 dracut[1242]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 08:21:45 np0005593250 cloud-init[1277]: #############################################################
Jan 23 08:21:45 np0005593250 cloud-init[1281]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 08:21:45 np0005593250 cloud-init[1286]: 256 SHA256:Y9S0bBrpeEA1W6+HXOJ52JYpgU+YlJJq7RK4hgyCY2g root@np0005593250 (ECDSA)
Jan 23 08:21:45 np0005593250 cloud-init[1292]: 256 SHA256:4Mx4pMLTtpDCsxWTVsHSphkYsCGO+DCeA1QY8tQ42Ro root@np0005593250 (ED25519)
Jan 23 08:21:45 np0005593250 cloud-init[1295]: 3072 SHA256:8uJefgxglv5LT00fMI0VkFNXV8ruDjNqf0pvkZ721og root@np0005593250 (RSA)
Jan 23 08:21:45 np0005593250 cloud-init[1298]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 08:21:45 np0005593250 cloud-init[1301]: #############################################################
Jan 23 08:21:45 np0005593250 cloud-init[1258]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 08:21:45 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.34 seconds
Jan 23 08:21:45 np0005593250 systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 08:21:45 np0005593250 systemd[1]: Reached target Cloud-init target.
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 23 08:21:46 np0005593250 dracut[1242]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 23 08:21:46 np0005593250 dracut[1242]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: memstrack is not available
Jan 23 08:21:46 np0005593250 dracut[1242]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 08:21:46 np0005593250 dracut[1242]: memstrack is not available
Jan 23 08:21:46 np0005593250 dracut[1242]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 08:21:46 np0005593250 dracut[1242]: *** Including module: systemd ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: fips ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: systemd-initrd ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: i18n ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: drm ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: prefixdevname ***
Jan 23 08:21:47 np0005593250 dracut[1242]: *** Including module: kernel-modules ***
Jan 23 08:21:47 np0005593250 kernel: block vda: the capability attribute has been deprecated.
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: kernel-modules-extra ***
Jan 23 08:21:48 np0005593250 dracut[1242]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 23 08:21:48 np0005593250 dracut[1242]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 23 08:21:48 np0005593250 dracut[1242]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 23 08:21:48 np0005593250 dracut[1242]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: qemu ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: fstab-sys ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: rootfs-block ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: terminfo ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: udev-rules ***
Jan 23 08:21:48 np0005593250 dracut[1242]: Skipping udev rule: 91-permissions.rules
Jan 23 08:21:48 np0005593250 dracut[1242]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: virtiofs ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: dracut-systemd ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: usrmount ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: base ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: fs-lib ***
Jan 23 08:21:48 np0005593250 dracut[1242]: *** Including module: kdumpbase ***
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 08:21:49 np0005593250 dracut[1242]:   microcode_ctl module: mangling fw_dir
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 08:21:49 np0005593250 irqbalance[742]: Cannot change IRQ 45 affinity: Operation not permitted
Jan 23 08:21:49 np0005593250 irqbalance[742]: IRQ 45 affinity is now unmanaged
Jan 23 08:21:49 np0005593250 irqbalance[742]: Cannot change IRQ 44 affinity: Operation not permitted
Jan 23 08:21:49 np0005593250 irqbalance[742]: IRQ 44 affinity is now unmanaged
Jan 23 08:21:49 np0005593250 irqbalance[742]: Cannot change IRQ 42 affinity: Operation not permitted
Jan 23 08:21:49 np0005593250 irqbalance[742]: IRQ 42 affinity is now unmanaged
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 08:21:49 np0005593250 dracut[1242]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Including module: openssl ***
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Including module: shutdown ***
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Including module: squash ***
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Including modules done ***
Jan 23 08:21:49 np0005593250 dracut[1242]: *** Installing kernel module dependencies ***
Jan 23 08:21:50 np0005593250 dracut[1242]: *** Installing kernel module dependencies done ***
Jan 23 08:21:50 np0005593250 dracut[1242]: *** Resolving executable dependencies ***
Jan 23 08:21:51 np0005593250 dracut[1242]: *** Resolving executable dependencies done ***
Jan 23 08:21:51 np0005593250 dracut[1242]: *** Generating early-microcode cpio image ***
Jan 23 08:21:51 np0005593250 dracut[1242]: *** Store current command line parameters ***
Jan 23 08:21:51 np0005593250 dracut[1242]: Stored kernel commandline:
Jan 23 08:21:51 np0005593250 dracut[1242]: No dracut internal kernel commandline stored in the initramfs
Jan 23 08:21:51 np0005593250 dracut[1242]: *** Install squash loader ***
Jan 23 08:21:52 np0005593250 dracut[1242]: *** Squashing the files inside the initramfs ***
Jan 23 08:21:53 np0005593250 dracut[1242]: *** Squashing the files inside the initramfs done ***
Jan 23 08:21:53 np0005593250 dracut[1242]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 08:21:53 np0005593250 dracut[1242]: *** Hardlinking files ***
Jan 23 08:21:53 np0005593250 dracut[1242]: Mode:           real
Jan 23 08:21:53 np0005593250 dracut[1242]: Files:          50
Jan 23 08:21:53 np0005593250 dracut[1242]: Linked:         0 files
Jan 23 08:21:53 np0005593250 dracut[1242]: Compared:       0 xattrs
Jan 23 08:21:53 np0005593250 dracut[1242]: Compared:       0 files
Jan 23 08:21:53 np0005593250 dracut[1242]: Saved:          0 B
Jan 23 08:21:53 np0005593250 dracut[1242]: Duration:       0.000440 seconds
Jan 23 08:21:53 np0005593250 dracut[1242]: *** Hardlinking files done ***
Jan 23 08:21:53 np0005593250 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:21:53 np0005593250 dracut[1242]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 08:21:54 np0005593250 kdumpctl[970]: kdump: kexec: loaded kdump kernel
Jan 23 08:21:54 np0005593250 kdumpctl[970]: kdump: Starting kdump: [OK]
Jan 23 08:21:54 np0005593250 systemd[1]: Finished Crash recovery kernel arming.
Jan 23 08:21:54 np0005593250 systemd[1]: Startup finished in 1.409s (kernel) + 2.159s (initrd) + 16.266s (userspace) = 19.835s.
Jan 23 08:22:10 np0005593250 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 08:24:43 np0005593250 sshd-session[4376]: Accepted publickey for zuul from 192.168.26.12 port 44826 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 23 08:24:43 np0005593250 systemd[1]: Created slice User Slice of UID 1000.
Jan 23 08:24:43 np0005593250 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 08:24:43 np0005593250 systemd-logind[746]: New session 1 of user zuul.
Jan 23 08:24:44 np0005593250 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 08:24:44 np0005593250 systemd[1]: Starting User Manager for UID 1000...
Jan 23 08:24:44 np0005593250 systemd[4380]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:24:44 np0005593250 systemd[4380]: Queued start job for default target Main User Target.
Jan 23 08:24:44 np0005593250 systemd[4380]: Created slice User Application Slice.
Jan 23 08:24:44 np0005593250 systemd[4380]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 08:24:44 np0005593250 systemd[4380]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 08:24:44 np0005593250 systemd[4380]: Reached target Paths.
Jan 23 08:24:44 np0005593250 systemd[4380]: Reached target Timers.
Jan 23 08:24:44 np0005593250 systemd[4380]: Starting D-Bus User Message Bus Socket...
Jan 23 08:24:44 np0005593250 systemd[4380]: Starting Create User's Volatile Files and Directories...
Jan 23 08:24:44 np0005593250 systemd[4380]: Listening on D-Bus User Message Bus Socket.
Jan 23 08:24:44 np0005593250 systemd[4380]: Reached target Sockets.
Jan 23 08:24:44 np0005593250 systemd[4380]: Finished Create User's Volatile Files and Directories.
Jan 23 08:24:44 np0005593250 systemd[4380]: Reached target Basic System.
Jan 23 08:24:44 np0005593250 systemd[4380]: Reached target Main User Target.
Jan 23 08:24:44 np0005593250 systemd[4380]: Startup finished in 83ms.
Jan 23 08:24:44 np0005593250 systemd[1]: Started User Manager for UID 1000.
Jan 23 08:24:44 np0005593250 systemd[1]: Started Session 1 of User zuul.
Jan 23 08:24:44 np0005593250 sshd-session[4376]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:24:44 np0005593250 python3[4462]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:24:46 np0005593250 python3[4490]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:24:52 np0005593250 python3[4544]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:24:53 np0005593250 python3[4584]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 08:24:54 np0005593250 python3[4610]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDW9tHRUnfYzwF276MDllxiGkbtesQd3NGwckEumA9NTME6fS78zS1FiyLnsYl9vwB/dI35aDK8HE+107wuVfrjl4h8jOUeB2NsCSEb/F+rMt50/jwyQZnP3mfwUcgNNMNnnJTvPSL3xdDZQTsxgmHi2xaj1ic3uyaFUO/n6T6KSFRzPYtcAun4HYim2Qn2RXuM24QyhfSUKJP5ogp0Gqci5fkI5NKQE2e2Dwjjz1rEQju0eWYnEexCu/6Lf94ScTK2CBaulbHAGvNgrxRB6yWapJTWQTKyxvD9YfyBz95d5xZ2T77nbF9LjVuucREHD8NtYXaWukdplYVvtmdQPKxNCKKagq4kecgKOaWcx3U6w+1F3iiIotXe+kkFlG8ow3c9PCEcjs8Gs3L5hZXNQjQUf54T1QMvdp4r+6Gh6UbMkT/XGqvbzP472DIGZIG61SScG7Kj+9aEMJraJbZBMiXO+8eJT6k5iEkeAF2eXGi2+VGvxh6XmlQQFLat5CX6Q0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:24:55 np0005593250 python3[4634]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:24:55 np0005593250 python3[4733]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:24:55 np0005593250 python3[4804]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156695.4397206-251-80444932869399/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b1b80db616c24c299c1e17279cd8a822_id_rsa follow=False checksum=9949a6c0cf2c56690fc05590688b18985827ec7b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:24:56 np0005593250 python3[4927]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:24:56 np0005593250 python3[4998]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156696.0747626-306-192538349113269/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b1b80db616c24c299c1e17279cd8a822_id_rsa.pub follow=False checksum=ab0ccadd5578d44a9af9b7d1c4319c85626104ad backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:24:57 np0005593250 python3[5046]: ansible-ping Invoked with data=pong
Jan 23 08:24:58 np0005593250 python3[5070]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:25:00 np0005593250 chronyd[754]: Selected source 158.51.99.19 (2.centos.pool.ntp.org)
Jan 23 08:25:01 np0005593250 python3[5124]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 08:25:01 np0005593250 python3[5156]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:02 np0005593250 python3[5180]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:02 np0005593250 python3[5204]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:02 np0005593250 python3[5228]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:02 np0005593250 python3[5252]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:02 np0005593250 python3[5276]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:04 np0005593250 sudo[5300]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lilmetswsmsehcsrcygwidxpxqvuesnc ; /usr/bin/python3'
Jan 23 08:25:04 np0005593250 sudo[5300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:04 np0005593250 python3[5302]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:04 np0005593250 sudo[5300]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:04 np0005593250 sudo[5378]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxnixyrcqrjvxscsktshowtpphwfgvdy ; /usr/bin/python3'
Jan 23 08:25:04 np0005593250 sudo[5378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:04 np0005593250 python3[5380]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:25:04 np0005593250 sudo[5378]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:04 np0005593250 sudo[5451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icxtslwspglruzpcrdetkvwvjtqukddx ; /usr/bin/python3'
Jan 23 08:25:04 np0005593250 sudo[5451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:05 np0005593250 python3[5453]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156704.4472306-31-191571782357144/source follow=False _original_basename=mirror_info.sh.j2 checksum=3f92644b791816833989d215b9a84c589a7b8ebd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:05 np0005593250 sudo[5451]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:05 np0005593250 python3[5501]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:05 np0005593250 python3[5525]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:05 np0005593250 python3[5549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:06 np0005593250 python3[5573]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:06 np0005593250 python3[5597]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:06 np0005593250 python3[5621]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:06 np0005593250 python3[5645]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:06 np0005593250 python3[5669]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:07 np0005593250 python3[5693]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:07 np0005593250 python3[5717]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:07 np0005593250 python3[5741]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:07 np0005593250 python3[5765]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:07 np0005593250 python3[5789]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:08 np0005593250 python3[5813]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:08 np0005593250 python3[5837]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:08 np0005593250 python3[5861]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:08 np0005593250 python3[5885]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:09 np0005593250 python3[5909]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:09 np0005593250 python3[5933]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:09 np0005593250 python3[5957]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:09 np0005593250 python3[5981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:09 np0005593250 python3[6005]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:10 np0005593250 python3[6029]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:10 np0005593250 python3[6053]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:10 np0005593250 python3[6077]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:10 np0005593250 python3[6101]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:25:12 np0005593250 sudo[6125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvupmsiavcqtsnpabucnmlavozvlytfi ; /usr/bin/python3'
Jan 23 08:25:12 np0005593250 sudo[6125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:12 np0005593250 python3[6127]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 08:25:12 np0005593250 systemd[1]: Starting Time & Date Service...
Jan 23 08:25:13 np0005593250 systemd[1]: Started Time & Date Service.
Jan 23 08:25:13 np0005593250 systemd-timedated[6129]: Changed time zone to 'UTC' (UTC).
Jan 23 08:25:13 np0005593250 sudo[6125]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:15 np0005593250 sudo[6156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpkbcybrcaitfexfbuqhoacvtpadqpmx ; /usr/bin/python3'
Jan 23 08:25:15 np0005593250 sudo[6156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:15 np0005593250 python3[6158]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:15 np0005593250 sudo[6156]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:15 np0005593250 python3[6234]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:25:15 np0005593250 python3[6305]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769156715.528617-251-218871189473764/source _original_basename=tmpr_jmr_x6 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:16 np0005593250 python3[6405]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:25:16 np0005593250 python3[6476]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156716.1164439-301-127291932844660/source _original_basename=tmpj7br9z2d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:17 np0005593250 sudo[6576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvcujccaxpfbeydkeevqobuqtcoqkwb ; /usr/bin/python3'
Jan 23 08:25:17 np0005593250 sudo[6576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:17 np0005593250 python3[6578]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:25:17 np0005593250 sudo[6576]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:17 np0005593250 sudo[6649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmlmdpigcanrzuviqfwiisoibutwpjpt ; /usr/bin/python3'
Jan 23 08:25:17 np0005593250 sudo[6649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:17 np0005593250 python3[6651]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156717.0301242-381-90883492208446/source _original_basename=tmpvq_0p6y5 follow=False checksum=6722a0bcd4c1c779ab61e7248edf06c9ac2cf575 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:17 np0005593250 sudo[6649]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:17 np0005593250 python3[6699]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:25:18 np0005593250 python3[6725]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:25:18 np0005593250 sudo[6803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcevxxzarkamgtwjezsrjflzbnavilt ; /usr/bin/python3'
Jan 23 08:25:18 np0005593250 sudo[6803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:18 np0005593250 python3[6805]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:25:18 np0005593250 sudo[6803]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:18 np0005593250 sudo[6876]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfvkngmnffkzafwyubnqafqnjmelsrdq ; /usr/bin/python3'
Jan 23 08:25:18 np0005593250 sudo[6876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:18 np0005593250 python3[6878]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156718.2084556-451-158934062123320/source _original_basename=tmp8wor1snv follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:18 np0005593250 sudo[6876]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:18 np0005593250 sudo[6927]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfzwvahelacycnknqnjauvqkjqasxep ; /usr/bin/python3'
Jan 23 08:25:18 np0005593250 sudo[6927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:19 np0005593250 python3[6929]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e4f-9ce5-3f1f-1cc1-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:25:19 np0005593250 sudo[6927]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:19 np0005593250 python3[6957]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                             _uses_shell=True zuul_log_id=fa163e4f-9ce5-3f1f-1cc1-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 08:25:20 np0005593250 python3[6985]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:35 np0005593250 sudo[7009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsrnxajrgrkiymxcxdvvoydbwavcqdmv ; /usr/bin/python3'
Jan 23 08:25:35 np0005593250 sudo[7009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:25:36 np0005593250 python3[7011]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:25:36 np0005593250 sudo[7009]: pam_unix(sudo:session): session closed for user root
Jan 23 08:25:43 np0005593250 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: ROM [mem 0x00000000-0x0003ffff pref]
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: ROM [mem 0xfe000000-0xfe03ffff pref]: assigned
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfb600000-0xfb603fff 64bit pref]: assigned
Jan 23 08:26:05 np0005593250 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfe040000-0xfe040fff]: assigned
Jan 23 08:26:05 np0005593250 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2144] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 08:26:05 np0005593250 systemd-udevd[7015]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2250] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2269] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2271] device (eth1): carrier: link connected
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2272] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2276] policy: auto-activating connection 'Wired connection 1' (17ffb41b-c510-3ee4-8367-2fd6b152fbfa)
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2278] device (eth1): Activation: starting connection 'Wired connection 1' (17ffb41b-c510-3ee4-8367-2fd6b152fbfa)
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2279] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2280] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2282] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:26:05 np0005593250 NetworkManager[812]: <info>  [1769156765.2286] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:05 np0005593250 python3[7041]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e4f-9ce5-4808-abfc-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:26:12 np0005593250 sudo[7119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmpvpgldcozjayjokbdksdgkjqscgrn ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Jan 23 08:26:12 np0005593250 sudo[7119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:26:12 np0005593250 python3[7121]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:26:12 np0005593250 sudo[7119]: pam_unix(sudo:session): session closed for user root
Jan 23 08:26:12 np0005593250 sudo[7192]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzzhicpuiftkiwiyqgsygggyrhbzeild ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Jan 23 08:26:12 np0005593250 sudo[7192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:26:12 np0005593250 python3[7194]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156772.0918326-113-179119216485441/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=71438e4a9e0900705558243c7e8cdb6da91439f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:26:12 np0005593250 sudo[7192]: pam_unix(sudo:session): session closed for user root
Jan 23 08:26:12 np0005593250 sudo[7242]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgrvykrdabdbdkbibldtvwykgkazqgts ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Jan 23 08:26:12 np0005593250 sudo[7242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:26:13 np0005593250 python3[7244]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:26:13 np0005593250 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 08:26:13 np0005593250 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 08:26:13 np0005593250 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 08:26:13 np0005593250 systemd[1]: Stopping Network Manager...
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2004] caught SIGTERM, shutting down normally.
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2013] dhcp4 (eth0): canceled DHCP transaction
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2014] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2014] dhcp4 (eth0): state changed no lease
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2015] dhcp6 (eth0): canceled DHCP transaction
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2015] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2015] dhcp6 (eth0): state changed no lease
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2018] manager: NetworkManager state is now CONNECTING
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2087] dhcp4 (eth1): canceled DHCP transaction
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2088] dhcp4 (eth1): state changed no lease
Jan 23 08:26:13 np0005593250 NetworkManager[812]: <info>  [1769156773.2108] exiting (success)
Jan 23 08:26:13 np0005593250 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:26:13 np0005593250 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:26:13 np0005593250 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 08:26:13 np0005593250 systemd[1]: Stopped Network Manager.
Jan 23 08:26:13 np0005593250 systemd[1]: Starting Network Manager...
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.2628] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:1b526f85-5c85-4390-8f4f-c5e9ef0f805d)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.2630] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.2675] manager[0x555b959c6000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 08:26:13 np0005593250 systemd[1]: Starting Hostname Service...
Jan 23 08:26:13 np0005593250 systemd[1]: Started Hostname Service.
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3260] hostname: hostname: using hostnamed
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3261] hostname: static hostname changed from (none) to "np0005593250"
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3264] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3267] manager[0x555b959c6000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3267] manager[0x555b959c6000]: rfkill: WWAN hardware radio set enabled
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3288] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3288] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3289] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3289] manager: Networking is enabled by state file
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3291] settings: Loaded settings plugin: keyfile (internal)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3294] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3319] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3328] dhcp: init: Using DHCP client 'internal'
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3333] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3338] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3343] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3351] device (lo): Activation: starting connection 'lo' (ab218173-ee72-4eee-b4f5-cff3eabad9ac)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3358] device (eth0): carrier: link connected
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3361] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3366] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3366] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3372] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3376] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3381] device (eth1): carrier: link connected
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3385] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3390] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (17ffb41b-c510-3ee4-8367-2fd6b152fbfa) (indicated)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3390] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3396] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3403] device (eth1): Activation: starting connection 'Wired connection 1' (17ffb41b-c510-3ee4-8367-2fd6b152fbfa)
Jan 23 08:26:13 np0005593250 systemd[1]: Started Network Manager.
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3427] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3431] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3433] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3434] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3436] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3437] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3439] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3440] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3442] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3445] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3449] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3451] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3453] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3458] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3464] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3484] dhcp4 (eth0): state changed new lease, address=192.168.26.29
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3489] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3517] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3518] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 08:26:13 np0005593250 systemd[1]: Starting Network Manager Wait Online...
Jan 23 08:26:13 np0005593250 NetworkManager[7254]: <info>  [1769156773.3524] device (lo): Activation: successful, device activated.
Jan 23 08:26:13 np0005593250 sudo[7242]: pam_unix(sudo:session): session closed for user root
Jan 23 08:26:13 np0005593250 python3[7316]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e4f-9ce5-4808-abfc-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4069] dhcp6 (eth0): state changed new lease, address=2001:db8::397
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4082] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4119] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4121] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4123] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4125] device (eth0): Activation: successful, device activated.
Jan 23 08:26:14 np0005593250 NetworkManager[7254]: <info>  [1769156774.4131] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 08:26:24 np0005593250 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:26:43 np0005593250 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 08:26:55 np0005593250 systemd[4380]: Starting Mark boot as successful...
Jan 23 08:26:55 np0005593250 systemd[4380]: Finished Mark boot as successful.
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4764] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 08:26:58 np0005593250 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:26:58 np0005593250 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4968] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4969] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4973] device (eth1): Activation: successful, device activated.
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4977] manager: startup complete
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4978] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <warn>  [1769156818.4980] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.4985] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 systemd[1]: Finished Network Manager Wait Online.
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5026] dhcp4 (eth1): canceled DHCP transaction
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5027] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5027] dhcp4 (eth1): state changed no lease
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5035] policy: auto-activating connection 'ci-private-network' (eb1a661b-2380-537d-bcf6-15c64e0c452b)
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5038] device (eth1): Activation: starting connection 'ci-private-network' (eb1a661b-2380-537d-bcf6-15c64e0c452b)
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5039] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5041] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5045] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5052] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5075] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5076] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:26:58 np0005593250 NetworkManager[7254]: <info>  [1769156818.5080] device (eth1): Activation: successful, device activated.
Jan 23 08:27:08 np0005593250 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:27:13 np0005593250 sshd-session[4389]: Received disconnect from 192.168.26.12 port 44826:11: disconnected by user
Jan 23 08:27:13 np0005593250 sshd-session[4389]: Disconnected from user zuul 192.168.26.12 port 44826
Jan 23 08:27:13 np0005593250 sshd-session[4376]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:27:13 np0005593250 systemd-logind[746]: Session 1 logged out. Waiting for processes to exit.
Jan 23 08:27:29 np0005593250 sshd-session[7365]: Accepted publickey for zuul from 192.168.26.12 port 50548 ssh2: RSA SHA256:AKJp1nqa0URRkslOBoQNVhtrZCfp8Xk5d6c5GGuTL0E
Jan 23 08:27:29 np0005593250 systemd-logind[746]: New session 3 of user zuul.
Jan 23 08:27:29 np0005593250 systemd[1]: Started Session 3 of User zuul.
Jan 23 08:27:29 np0005593250 sshd-session[7365]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:27:29 np0005593250 sudo[7444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywehbycewaecpomrytopzqekzhrpcvtz ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Jan 23 08:27:29 np0005593250 sudo[7444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:27:29 np0005593250 python3[7446]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:27:29 np0005593250 sudo[7444]: pam_unix(sudo:session): session closed for user root
Jan 23 08:27:29 np0005593250 sudo[7517]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnpskjhdkycixvakdkdooqlpgufmfyqg ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Jan 23 08:27:29 np0005593250 sudo[7517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:27:29 np0005593250 python3[7519]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156849.4460666-368-64758759321322/source _original_basename=tmpwkopa7hx follow=False checksum=1f2e30b022ca0e5b9f593607af46da58154c84dd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:27:29 np0005593250 sudo[7517]: pam_unix(sudo:session): session closed for user root
Jan 23 08:27:32 np0005593250 sshd-session[7368]: Connection closed by 192.168.26.12 port 50548
Jan 23 08:27:32 np0005593250 sshd-session[7365]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:27:32 np0005593250 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 08:27:32 np0005593250 systemd-logind[746]: Session 3 logged out. Waiting for processes to exit.
Jan 23 08:27:32 np0005593250 systemd-logind[746]: Removed session 3.
Jan 23 08:29:55 np0005593250 systemd[4380]: Created slice User Background Tasks Slice.
Jan 23 08:29:55 np0005593250 systemd[4380]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 08:29:55 np0005593250 systemd[4380]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 08:31:50 np0005593250 sshd-session[7547]: Accepted publickey for zuul from 192.168.26.12 port 47470 ssh2: RSA SHA256:AKJp1nqa0URRkslOBoQNVhtrZCfp8Xk5d6c5GGuTL0E
Jan 23 08:31:50 np0005593250 systemd-logind[746]: New session 4 of user zuul.
Jan 23 08:31:50 np0005593250 systemd[1]: Started Session 4 of User zuul.
Jan 23 08:31:50 np0005593250 sshd-session[7547]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:31:50 np0005593250 sudo[7574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iejzodrsisnoytmqfazedijgqlvogqis ; /usr/bin/python3'
Jan 23 08:31:50 np0005593250 sudo[7574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:51 np0005593250 python3[7576]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                             _uses_shell=True zuul_log_id=fa163e4f-9ce5-5b4e-271e-000000000c9e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:51 np0005593250 sudo[7574]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:51 np0005593250 sudo[7603]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdbmzmkxjcseygjezegmckxzbxaysfqm ; /usr/bin/python3'
Jan 23 08:31:51 np0005593250 sudo[7603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:51 np0005593250 python3[7605]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:51 np0005593250 sudo[7603]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:51 np0005593250 sudo[7629]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjnaggiqplokefegggqjxorsuibpgezj ; /usr/bin/python3'
Jan 23 08:31:51 np0005593250 sudo[7629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:51 np0005593250 python3[7631]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:51 np0005593250 sudo[7629]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:51 np0005593250 sudo[7655]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jekzngtwzlnyvzmxciiomrxmlupseflf ; /usr/bin/python3'
Jan 23 08:31:51 np0005593250 sudo[7655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:51 np0005593250 python3[7657]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:51 np0005593250 sudo[7655]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:51 np0005593250 sudo[7681]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyjxkekjsrmgpvhggnzkyawtqqpfabvg ; /usr/bin/python3'
Jan 23 08:31:51 np0005593250 sudo[7681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:51 np0005593250 python3[7683]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:51 np0005593250 sudo[7681]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:52 np0005593250 sudo[7707]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-figdidkspmcxillqhiabiunfpxdbtffo ; /usr/bin/python3'
Jan 23 08:31:52 np0005593250 sudo[7707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:52 np0005593250 python3[7709]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:52 np0005593250 sudo[7707]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:52 np0005593250 sudo[7785]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raldbhohhmlhfyykoubqczubdklgvllz ; /usr/bin/python3'
Jan 23 08:31:52 np0005593250 sudo[7785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:52 np0005593250 python3[7787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:31:52 np0005593250 sudo[7785]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:52 np0005593250 sudo[7858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzepnbqujspzmdnaqiqqghrpvwbidkv ; /usr/bin/python3'
Jan 23 08:31:52 np0005593250 sudo[7858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:52 np0005593250 python3[7860]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769157112.5267498-361-122521139436276/source _original_basename=tmpesz222tu follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:31:52 np0005593250 sudo[7858]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:53 np0005593250 sudo[7908]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vglfxbmzlrqqvsziywqhtyqpwcjepzcb ; /usr/bin/python3'
Jan 23 08:31:53 np0005593250 sudo[7908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:53 np0005593250 python3[7910]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 08:31:53 np0005593250 systemd[1]: Reloading.
Jan 23 08:31:53 np0005593250 systemd-rc-local-generator[7926]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:31:53 np0005593250 sudo[7908]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:55 np0005593250 sudo[7963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxhmlfekwfglstshrckpksffzdzcuwrr ; /usr/bin/python3'
Jan 23 08:31:55 np0005593250 sudo[7963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:55 np0005593250 python3[7965]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 08:31:55 np0005593250 sudo[7963]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:55 np0005593250 sudo[7989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnvdgfexwewxkhjgcsoutdglxjhqmyyv ; /usr/bin/python3'
Jan 23 08:31:55 np0005593250 sudo[7989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:55 np0005593250 python3[7991]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:55 np0005593250 sudo[7989]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:55 np0005593250 sudo[8017]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huajbhivvnkzllbclpfrrghymymlaajf ; /usr/bin/python3'
Jan 23 08:31:55 np0005593250 sudo[8017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:55 np0005593250 python3[8019]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:55 np0005593250 sudo[8017]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:55 np0005593250 sudo[8045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthjqbozvmobkwrabwewisuinlyapohd ; /usr/bin/python3'
Jan 23 08:31:55 np0005593250 sudo[8045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:55 np0005593250 python3[8047]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:55 np0005593250 sudo[8045]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:55 np0005593250 sudo[8073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bothcndogshrxcqxgjkslbozlxaqfmlg ; /usr/bin/python3'
Jan 23 08:31:55 np0005593250 sudo[8073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:31:55 np0005593250 python3[8075]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:55 np0005593250 sudo[8073]: pam_unix(sudo:session): session closed for user root
Jan 23 08:31:56 np0005593250 python3[8102]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                             _uses_shell=True zuul_log_id=fa163e4f-9ce5-5b4e-271e-000000000ca5-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:31:56 np0005593250 python3[8132]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 08:31:59 np0005593250 sshd-session[7550]: Connection closed by 192.168.26.12 port 47470
Jan 23 08:31:59 np0005593250 sshd-session[7547]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:31:59 np0005593250 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 08:31:59 np0005593250 systemd[1]: session-4.scope: Consumed 2.917s CPU time.
Jan 23 08:31:59 np0005593250 systemd-logind[746]: Session 4 logged out. Waiting for processes to exit.
Jan 23 08:31:59 np0005593250 systemd-logind[746]: Removed session 4.
Jan 23 08:32:00 np0005593250 sshd-session[8138]: Accepted publickey for zuul from 192.168.26.12 port 36432 ssh2: RSA SHA256:AKJp1nqa0URRkslOBoQNVhtrZCfp8Xk5d6c5GGuTL0E
Jan 23 08:32:00 np0005593250 systemd-logind[746]: New session 5 of user zuul.
Jan 23 08:32:00 np0005593250 systemd[1]: Started Session 5 of User zuul.
Jan 23 08:32:00 np0005593250 sshd-session[8138]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:32:00 np0005593250 sudo[8165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvcscjyxnytrsmuafngbgdyymscugrw ; /usr/bin/python3'
Jan 23 08:32:00 np0005593250 sudo[8165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:32:01 np0005593250 python3[8167]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 08:32:09 np0005593250 setsebool[8211]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 08:32:09 np0005593250 setsebool[8211]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 08:32:17 np0005593250 kernel: SELinux:  Converting 386 SID table entries...
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability open_perms=1
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:32:17 np0005593250 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  Converting 389 SID table entries...
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability open_perms=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:32:24 np0005593250 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:32:36 np0005593250 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 08:32:36 np0005593250 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:32:36 np0005593250 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:32:36 np0005593250 systemd[1]: Reloading.
Jan 23 08:32:36 np0005593250 systemd-rc-local-generator[8976]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:32:36 np0005593250 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:32:37 np0005593250 sudo[8165]: pam_unix(sudo:session): session closed for user root
Jan 23 08:32:46 np0005593250 python3[17757]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                              _uses_shell=True zuul_log_id=fa163e4f-9ce5-d66a-8955-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:32:47 np0005593250 kernel: evm: overlay not supported
Jan 23 08:32:47 np0005593250 systemd[4380]: Starting D-Bus User Message Bus...
Jan 23 08:32:47 np0005593250 dbus-broker-launch[18379]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 08:32:47 np0005593250 dbus-broker-launch[18379]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 08:32:47 np0005593250 systemd[4380]: Started D-Bus User Message Bus.
Jan 23 08:32:47 np0005593250 dbus-broker-lau[18379]: Ready
Jan 23 08:32:47 np0005593250 systemd[4380]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 08:32:47 np0005593250 systemd[4380]: Created slice Slice /user.
Jan 23 08:32:47 np0005593250 systemd[4380]: podman-18304.scope: unit configures an IP firewall, but not running as root.
Jan 23 08:32:47 np0005593250 systemd[4380]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 08:32:47 np0005593250 systemd[4380]: Started podman-18304.scope.
Jan 23 08:32:48 np0005593250 systemd[4380]: Started podman-pause-dc113ce5.scope.
Jan 23 08:32:48 np0005593250 sudo[19027]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ficrkfevqwsvxewwagfydbgquvzzkvja ; /usr/bin/python3'
Jan 23 08:32:48 np0005593250 sudo[19027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:32:48 np0005593250 python3[19037]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                             location = "38.129.56.147:5001"
                                             insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                             location = "38.129.56.147:5001"
                                             insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:32:48 np0005593250 python3[19037]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 23 08:32:48 np0005593250 sudo[19027]: pam_unix(sudo:session): session closed for user root
Jan 23 08:32:49 np0005593250 sshd-session[8141]: Connection closed by 192.168.26.12 port 36432
Jan 23 08:32:49 np0005593250 sshd-session[8138]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:32:49 np0005593250 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 08:32:49 np0005593250 systemd[1]: session-5.scope: Consumed 30.737s CPU time.
Jan 23 08:32:49 np0005593250 systemd-logind[746]: Session 5 logged out. Waiting for processes to exit.
Jan 23 08:32:49 np0005593250 systemd-logind[746]: Removed session 5.
Jan 23 08:33:01 np0005593250 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:33:01 np0005593250 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:33:01 np0005593250 systemd[1]: man-db-cache-update.service: Consumed 31.612s CPU time.
Jan 23 08:33:01 np0005593250 systemd[1]: run-r82b80fa0e2ee4ca1bb2d89132748fe09.service: Deactivated successfully.
Jan 23 08:33:05 np0005593250 sshd-session[29643]: Connection closed by 192.168.26.111 port 49660 [preauth]
Jan 23 08:33:05 np0005593250 sshd-session[29644]: Connection closed by 192.168.26.111 port 49676 [preauth]
Jan 23 08:33:05 np0005593250 sshd-session[29645]: Unable to negotiate with 192.168.26.111 port 49680: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 23 08:33:05 np0005593250 sshd-session[29646]: Unable to negotiate with 192.168.26.111 port 49696: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 23 08:33:05 np0005593250 sshd-session[29647]: Unable to negotiate with 192.168.26.111 port 49700: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 23 08:33:14 np0005593250 sshd-session[29653]: Accepted publickey for zuul from 192.168.26.12 port 33138 ssh2: RSA SHA256:AKJp1nqa0URRkslOBoQNVhtrZCfp8Xk5d6c5GGuTL0E
Jan 23 08:33:14 np0005593250 systemd-logind[746]: New session 6 of user zuul.
Jan 23 08:33:14 np0005593250 systemd[1]: Started Session 6 of User zuul.
Jan 23 08:33:14 np0005593250 sshd-session[29653]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:33:14 np0005593250 python3[29680]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDcratYXAu4DypJd0WJTYbtDKK6LoKibHuKqoQ3qxMlNKNqgHsxjDcNM7T5pnFK+tXxzrQCE9mQeIPdaMd7KzTI= zuul@np0005593249
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:33:14 np0005593250 sudo[29704]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydtimocojyumiopkxsamiehvsuvollba ; /usr/bin/python3'
Jan 23 08:33:14 np0005593250 sudo[29704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:14 np0005593250 python3[29706]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDcratYXAu4DypJd0WJTYbtDKK6LoKibHuKqoQ3qxMlNKNqgHsxjDcNM7T5pnFK+tXxzrQCE9mQeIPdaMd7KzTI= zuul@np0005593249
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:33:14 np0005593250 sudo[29704]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:15 np0005593250 sudo[29730]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elilbxzdjhmlkuhcfyaxatshrjurqbcz ; /usr/bin/python3'
Jan 23 08:33:15 np0005593250 sudo[29730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:15 np0005593250 python3[29732]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593250 update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 08:33:15 np0005593250 useradd[29734]: new group: name=cloud-admin, GID=1002
Jan 23 08:33:15 np0005593250 useradd[29734]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 23 08:33:15 np0005593250 sudo[29730]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:15 np0005593250 sudo[29764]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwkkawpvmacvhzckohyyzmywdtuvjlxu ; /usr/bin/python3'
Jan 23 08:33:15 np0005593250 sudo[29764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:15 np0005593250 python3[29766]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDcratYXAu4DypJd0WJTYbtDKK6LoKibHuKqoQ3qxMlNKNqgHsxjDcNM7T5pnFK+tXxzrQCE9mQeIPdaMd7KzTI= zuul@np0005593249
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 08:33:15 np0005593250 sudo[29764]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:16 np0005593250 sudo[29842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhzslfhfnbzhfawaegbujmbcqloqghoy ; /usr/bin/python3'
Jan 23 08:33:16 np0005593250 sudo[29842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:16 np0005593250 python3[29844]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:33:16 np0005593250 sudo[29842]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:16 np0005593250 sudo[29915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzrfsfndutptdekzcsxbvcbfsaqloyx ; /usr/bin/python3'
Jan 23 08:33:16 np0005593250 sudo[29915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:16 np0005593250 python3[29917]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769157196.054448-169-112723194152752/source _original_basename=tmpetvxs8ef follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:33:16 np0005593250 sudo[29915]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:17 np0005593250 sudo[29965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znkaqzoqbapxihgfeijguuwspltvylcg ; /usr/bin/python3'
Jan 23 08:33:17 np0005593250 sudo[29965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:33:17 np0005593250 python3[29967]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 23 08:33:17 np0005593250 systemd[1]: Starting Hostname Service...
Jan 23 08:33:17 np0005593250 systemd[1]: Started Hostname Service.
Jan 23 08:33:17 np0005593250 systemd-hostnamed[29971]: Changed pretty hostname to 'compute-0'
Jan 23 08:33:17 compute-0 systemd-hostnamed[29971]: Hostname set to <compute-0> (static)
Jan 23 08:33:17 compute-0 NetworkManager[7254]: <info>  [1769157197.4126] hostname: static hostname changed from "np0005593250" to "compute-0"
Jan 23 08:33:17 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:33:17 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:33:17 compute-0 sudo[29965]: pam_unix(sudo:session): session closed for user root
Jan 23 08:33:17 compute-0 sshd-session[29656]: Connection closed by 192.168.26.12 port 33138
Jan 23 08:33:17 compute-0 sshd-session[29653]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:33:17 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 08:33:17 compute-0 systemd[1]: session-6.scope: Consumed 1.647s CPU time.
Jan 23 08:33:17 compute-0 systemd-logind[746]: Session 6 logged out. Waiting for processes to exit.
Jan 23 08:33:17 compute-0 systemd-logind[746]: Removed session 6.
Jan 23 08:33:27 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:33:47 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 08:36:45 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 08:36:45 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 08:36:45 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 08:36:45 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 08:36:58 compute-0 sshd-session[29992]: Accepted publickey for zuul from 192.168.26.111 port 32820 ssh2: RSA SHA256:AKJp1nqa0URRkslOBoQNVhtrZCfp8Xk5d6c5GGuTL0E
Jan 23 08:36:58 compute-0 systemd-logind[746]: New session 7 of user zuul.
Jan 23 08:36:58 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 23 08:36:58 compute-0 sshd-session[29992]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:36:59 compute-0 python3[30068]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:37:00 compute-0 sudo[30178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntrjdlldiycempyqavdvtqakrhkfwepz ; /usr/bin/python3'
Jan 23 08:37:00 compute-0 sudo[30178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:00 compute-0 python3[30180]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:00 compute-0 sudo[30178]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:00 compute-0 sudo[30251]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyshkyimsaucelwwfldksfaitsfcelpj ; /usr/bin/python3'
Jan 23 08:37:00 compute-0 sudo[30251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:00 compute-0 python3[30253]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=delorean.repo follow=False checksum=1d7412093fdea43b5454099227a576288791d9ce backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:00 compute-0 sudo[30251]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:00 compute-0 sudo[30277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyjdjswgyfltetggrmnjtputddzqjuwr ; /usr/bin/python3'
Jan 23 08:37:00 compute-0 sudo[30277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:00 compute-0 python3[30279]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:00 compute-0 sudo[30277]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:01 compute-0 sudo[30350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jepyxwahxpkgymepqspujfrunypnpgvd ; /usr/bin/python3'
Jan 23 08:37:01 compute-0 sudo[30350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:01 compute-0 python3[30352]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=50a3fd92f8bf68f65d4644f7ea4a784e3eaa0ad5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:01 compute-0 sudo[30350]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:01 compute-0 sudo[30376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiirlftwultfcgmxvcpzschbkrwfugmx ; /usr/bin/python3'
Jan 23 08:37:01 compute-0 sudo[30376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:01 compute-0 python3[30378]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:01 compute-0 sudo[30376]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:01 compute-0 sudo[30449]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqftevwqkhoemwhsivngiinjfsshgei ; /usr/bin/python3'
Jan 23 08:37:01 compute-0 sudo[30449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:01 compute-0 python3[30451]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=8163d09913b97597f86e38eb45c3003e91da783e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:01 compute-0 sudo[30449]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:01 compute-0 sudo[30475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxsautzndgqvwhksohdofvvtgmmoiplw ; /usr/bin/python3'
Jan 23 08:37:01 compute-0 sudo[30475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:01 compute-0 python3[30477]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:01 compute-0 sudo[30475]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:01 compute-0 sudo[30548]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxpmqxnimjawiwyqrlwbxxsbpseicske ; /usr/bin/python3'
Jan 23 08:37:01 compute-0 sudo[30548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:01 compute-0 python3[30550]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=d108d0750ad5b288ccc41bc6534ea307cc51e987 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:01 compute-0 sudo[30548]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:02 compute-0 sudo[30574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvhekzftfjekreanogjuawxgpoxxqbaj ; /usr/bin/python3'
Jan 23 08:37:02 compute-0 sudo[30574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:02 compute-0 python3[30576]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:02 compute-0 sudo[30574]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:02 compute-0 sudo[30647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvgpenqnbqtriuiukjappncfjkgrahum ; /usr/bin/python3'
Jan 23 08:37:02 compute-0 sudo[30647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:02 compute-0 python3[30649]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=20c3917c672c059a872cf09a437f61890d2f89fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:02 compute-0 sudo[30647]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:02 compute-0 sudo[30673]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsyuilgaavmdovfmidiyqsnqvjfkxari ; /usr/bin/python3'
Jan 23 08:37:02 compute-0 sudo[30673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:02 compute-0 python3[30675]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:02 compute-0 sudo[30673]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:02 compute-0 sudo[30746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkxbuqvloistwsqvzglzvwdpxjqronoo ; /usr/bin/python3'
Jan 23 08:37:02 compute-0 sudo[30746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:02 compute-0 python3[30748]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=4d14f168e8a0e6930d905faffbcdf4fedd6664d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:02 compute-0 sudo[30746]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:02 compute-0 sudo[30772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khjpaxjqhpibplxlelvxeuvtnfevyeak ; /usr/bin/python3'
Jan 23 08:37:02 compute-0 sudo[30772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:02 compute-0 python3[30774]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 08:37:02 compute-0 sudo[30772]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:03 compute-0 sudo[30845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oysymovypmgxprsekmkhyjijrdwntrmj ; /usr/bin/python3'
Jan 23 08:37:03 compute-0 sudo[30845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:37:03 compute-0 python3[30847]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157420.194839-34424-189029099426469/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:37:03 compute-0 sudo[30845]: pam_unix(sudo:session): session closed for user root
Jan 23 08:37:05 compute-0 sshd-session[30872]: Connection closed by 192.168.122.11 port 38970 [preauth]
Jan 23 08:37:05 compute-0 sshd-session[30873]: Connection closed by 192.168.122.11 port 38972 [preauth]
Jan 23 08:37:05 compute-0 sshd-session[30874]: Unable to negotiate with 192.168.122.11 port 38974: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 23 08:37:05 compute-0 sshd-session[30875]: Unable to negotiate with 192.168.122.11 port 38984: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 23 08:37:05 compute-0 sshd-session[30876]: Unable to negotiate with 192.168.122.11 port 38990: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 23 08:37:09 compute-0 python3[30905]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:42:09 compute-0 sshd-session[29995]: Received disconnect from 192.168.26.111 port 32820:11: disconnected by user
Jan 23 08:42:09 compute-0 sshd-session[29995]: Disconnected from user zuul 192.168.26.111 port 32820
Jan 23 08:42:09 compute-0 sshd-session[29992]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:42:09 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 08:42:09 compute-0 systemd[1]: session-7.scope: Consumed 3.339s CPU time.
Jan 23 08:42:09 compute-0 systemd-logind[746]: Session 7 logged out. Waiting for processes to exit.
Jan 23 08:42:09 compute-0 systemd-logind[746]: Removed session 7.
Jan 23 08:44:45 compute-0 systemd[1]: Starting dnf makecache...
Jan 23 08:44:45 compute-0 dnf[30907]: Failed determining last makecache time.
Jan 23 08:44:45 compute-0 dnf[30907]: delorean-openstack-barbican-42b4c41831408a8e323  82 kB/s |  13 kB     00:00
Jan 23 08:44:45 compute-0 dnf[30907]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 444 kB/s |  65 kB     00:00
Jan 23 08:44:45 compute-0 dnf[30907]: delorean-openstack-cinder-1c00d6490d88e436f26ef 225 kB/s |  32 kB     00:00
Jan 23 08:44:45 compute-0 dnf[30907]: delorean-python-stevedore-c4acc5639fd2329372142 877 kB/s | 131 kB     00:00
Jan 23 08:44:46 compute-0 dnf[30907]: delorean-python-cloudkitty-tests-tempest-2c80f8 229 kB/s |  32 kB     00:00
Jan 23 08:44:46 compute-0 dnf[30907]: delorean-os-refresh-config-9bfc52b5049be2d8de61 2.2 MB/s | 349 kB     00:00
Jan 23 08:44:46 compute-0 dnf[30907]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 292 kB/s |  42 kB     00:00
Jan 23 08:44:46 compute-0 dnf[30907]: delorean-python-designate-tests-tempest-347fdbc 120 kB/s |  18 kB     00:00
Jan 23 08:44:46 compute-0 dnf[30907]: delorean-openstack-glance-1fd12c29b339f30fe823e 119 kB/s |  18 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 203 kB/s |  29 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-openstack-manila-3c01b7181572c95dac462 180 kB/s |  25 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-python-whitebox-neutron-tests-tempest- 1.0 MB/s | 154 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-openstack-octavia-ba397f07a7331190208c 171 kB/s |  26 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-openstack-watcher-c014f81a8647287f6dcc 126 kB/s |  16 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-ansible-config_template-5ccaa22121a7ff  51 kB/s | 7.4 kB     00:00
Jan 23 08:44:47 compute-0 dnf[30907]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 976 kB/s | 144 kB     00:00
Jan 23 08:44:48 compute-0 dnf[30907]: delorean-openstack-swift-dc98a8463506ac520c469a  93 kB/s |  14 kB     00:00
Jan 23 08:44:48 compute-0 dnf[30907]: delorean-python-tempestconf-8515371b7cceebd4282 390 kB/s |  53 kB     00:00
Jan 23 08:44:48 compute-0 dnf[30907]: delorean-openstack-heat-ui-013accbfd179753bc3f0 688 kB/s |  96 kB     00:00
Jan 23 08:44:48 compute-0 dnf[30907]: CentOS Stream 9 - BaseOS                         18 kB/s | 6.7 kB     00:00
Jan 23 08:44:49 compute-0 dnf[30907]: CentOS Stream 9 - AppStream                      18 kB/s | 6.8 kB     00:00
Jan 23 08:44:50 compute-0 dnf[30907]: CentOS Stream 9 - CRB                           7.7 kB/s | 6.6 kB     00:00
Jan 23 08:44:50 compute-0 dnf[30907]: CentOS Stream 9 - Extras packages               9.7 kB/s | 7.3 kB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: dlrn-antelope-testing                           7.8 MB/s | 1.1 MB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: dlrn-antelope-build-deps                        3.2 MB/s | 461 kB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: centos9-rabbitmq                                8.2 MB/s | 123 kB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: centos9-storage                                  32 MB/s | 415 kB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: centos9-opstools                                4.0 MB/s |  51 kB     00:00
Jan 23 08:44:51 compute-0 dnf[30907]: NFV SIG OpenvSwitch                              32 MB/s | 461 kB     00:00
Jan 23 08:44:52 compute-0 dnf[30907]: repo-setup-centos-appstream                     215 MB/s |  26 MB     00:00
Jan 23 08:44:56 compute-0 dnf[30907]: repo-setup-centos-baseos                        211 MB/s | 8.9 MB     00:00
Jan 23 08:44:57 compute-0 dnf[30907]: repo-setup-centos-highavailability               36 MB/s | 744 kB     00:00
Jan 23 08:44:57 compute-0 dnf[30907]: repo-setup-centos-powertools                    193 MB/s | 7.6 MB     00:00
Jan 23 08:44:59 compute-0 irqbalance[742]: Cannot change IRQ 43 affinity: Operation not permitted
Jan 23 08:44:59 compute-0 irqbalance[742]: IRQ 43 affinity is now unmanaged
Jan 23 08:45:03 compute-0 dnf[30907]: Extra Packages for Enterprise Linux 9 - x86_64  5.4 MB/s |  20 MB     00:03
Jan 23 08:45:13 compute-0 dnf[30907]: Metadata cache created.
Jan 23 08:45:13 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 08:45:13 compute-0 systemd[1]: Finished dnf makecache.
Jan 23 08:45:13 compute-0 systemd[1]: dnf-makecache.service: Consumed 18.934s CPU time.
Jan 23 08:52:00 compute-0 sshd-session[31012]: Accepted publickey for zuul from 192.168.122.30 port 42950 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:52:00 compute-0 systemd-logind[746]: New session 8 of user zuul.
Jan 23 08:52:00 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 23 08:52:00 compute-0 sshd-session[31012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:52:01 compute-0 python3.9[31165]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:52:02 compute-0 sudo[31344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjzwnbbwiueklunyujedsjrayxnvnyra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158322.376589-56-263735702023557/AnsiballZ_command.py'
Jan 23 08:52:02 compute-0 sudo[31344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:02 compute-0 python3.9[31346]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:52:10 compute-0 sudo[31344]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:11 compute-0 sshd-session[31015]: Connection closed by 192.168.122.30 port 42950
Jan 23 08:52:11 compute-0 sshd-session[31012]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:52:11 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 08:52:11 compute-0 systemd[1]: session-8.scope: Consumed 6.112s CPU time.
Jan 23 08:52:11 compute-0 systemd-logind[746]: Session 8 logged out. Waiting for processes to exit.
Jan 23 08:52:11 compute-0 systemd-logind[746]: Removed session 8.
Jan 23 08:52:26 compute-0 sshd-session[31404]: Accepted publickey for zuul from 192.168.122.30 port 58642 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:52:26 compute-0 systemd-logind[746]: New session 9 of user zuul.
Jan 23 08:52:26 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 23 08:52:26 compute-0 sshd-session[31404]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:52:27 compute-0 python3.9[31557]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 08:52:28 compute-0 python3.9[31731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:52:28 compute-0 sudo[31881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytizihmowveeqffpozdkhaqgtlnlcbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158348.371645-93-35792848596340/AnsiballZ_command.py'
Jan 23 08:52:28 compute-0 sudo[31881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:28 compute-0 python3.9[31883]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:52:28 compute-0 sudo[31881]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:29 compute-0 sudo[32034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcpzagcsnvvibkipcjleodceqlxvdhhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158349.185045-129-71674905992550/AnsiballZ_stat.py'
Jan 23 08:52:29 compute-0 sudo[32034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:29 compute-0 python3.9[32036]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:52:29 compute-0 sudo[32034]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:30 compute-0 sudo[32186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohkkgtezffqkpgebrdmjzkqxfnslzagf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158349.7788444-153-184367993639058/AnsiballZ_file.py'
Jan 23 08:52:30 compute-0 sudo[32186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:30 compute-0 python3.9[32188]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:52:30 compute-0 sudo[32186]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:30 compute-0 sudo[32338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdwqyywssfncvbxrxanlxefrduwixwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158350.3745875-177-46491617953586/AnsiballZ_stat.py'
Jan 23 08:52:30 compute-0 sudo[32338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:30 compute-0 python3.9[32340]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:52:30 compute-0 sudo[32338]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:31 compute-0 sudo[32461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqadjjhrahgbjzytqwdjycobxncugbin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158350.3745875-177-46491617953586/AnsiballZ_copy.py'
Jan 23 08:52:31 compute-0 sudo[32461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:31 compute-0 python3.9[32463]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158350.3745875-177-46491617953586/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:52:31 compute-0 sudo[32461]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:31 compute-0 sudo[32613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umjoacvazqgveyqxubmbnefuzzgofdiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158351.3466196-222-244361737412479/AnsiballZ_setup.py'
Jan 23 08:52:31 compute-0 sudo[32613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:31 compute-0 python3.9[32615]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:52:31 compute-0 sudo[32613]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:32 compute-0 sudo[32770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmaqttkicowhxjnepoibjakkedzztut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158352.066853-246-75761077060188/AnsiballZ_file.py'
Jan 23 08:52:32 compute-0 sudo[32770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:32 compute-0 python3.9[32772]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:52:32 compute-0 sudo[32770]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:32 compute-0 sudo[32922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjafkalpmpqlrkhibgigjhdwzvankngu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158352.6133542-273-237293095775398/AnsiballZ_file.py'
Jan 23 08:52:32 compute-0 sudo[32922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:32 compute-0 python3.9[32924]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:52:32 compute-0 sudo[32922]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:33 compute-0 python3.9[33074]: ansible-ansible.builtin.service_facts Invoked
Jan 23 08:52:35 compute-0 python3.9[33327]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:52:36 compute-0 python3.9[33477]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:52:37 compute-0 python3.9[33631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:52:37 compute-0 sudo[33787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlauwahvytdnbxpbdfmcqyqouwjfqwzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158357.8095927-417-157628003453348/AnsiballZ_setup.py'
Jan 23 08:52:37 compute-0 sudo[33787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:38 compute-0 python3.9[33789]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:52:38 compute-0 sudo[33787]: pam_unix(sudo:session): session closed for user root
Jan 23 08:52:38 compute-0 sudo[33871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfibodvsumsnwdaotcndrnwcejbpjmpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158357.8095927-417-157628003453348/AnsiballZ_dnf.py'
Jan 23 08:52:38 compute-0 sudo[33871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:52:38 compute-0 python3.9[33873]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:53:58 compute-0 systemd[1]: Reloading.
Jan 23 08:53:58 compute-0 systemd-rc-local-generator[34073]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:53:59 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 08:53:59 compute-0 systemd[1]: Reloading.
Jan 23 08:53:59 compute-0 systemd-rc-local-generator[34116]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:53:59 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 08:53:59 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 08:53:59 compute-0 systemd[1]: Reloading.
Jan 23 08:53:59 compute-0 systemd-rc-local-generator[34158]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:53:59 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 08:53:59 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 08:53:59 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 08:53:59 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 08:54:43 compute-0 kernel: SELinux:  Converting 2725 SID table entries...
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:54:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:54:43 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 08:54:43 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:54:43 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:54:43 compute-0 systemd[1]: Reloading.
Jan 23 08:54:43 compute-0 systemd-rc-local-generator[34452]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:54:43 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:54:44 compute-0 sudo[33871]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:44 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:54:44 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:54:44 compute-0 systemd[1]: run-r65610b75fd5b4636a0847c0918ec7fa6.service: Deactivated successfully.
Jan 23 08:54:44 compute-0 sudo[35371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmcjdrjndnsmgpbonunmxzwbtafthoof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158484.2984297-453-175631919601924/AnsiballZ_command.py'
Jan 23 08:54:44 compute-0 sudo[35371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:44 compute-0 python3.9[35373]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:54:45 compute-0 sudo[35371]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:45 compute-0 sudo[35652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbcaqwalqhzdstahlrzpfgtwoorqrsap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158485.4919882-477-1892576649801/AnsiballZ_selinux.py'
Jan 23 08:54:45 compute-0 sudo[35652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:46 compute-0 python3.9[35654]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 08:54:46 compute-0 sudo[35652]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:46 compute-0 sudo[35804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aifpgmthchpotqohveazncdbzgfhfvma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158486.5388873-510-186315849401108/AnsiballZ_command.py'
Jan 23 08:54:46 compute-0 sudo[35804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:46 compute-0 python3.9[35806]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 08:54:47 compute-0 sudo[35804]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:47 compute-0 sudo[35957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjupebzmcaakknkzenyzaawrjwxkherg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158487.6227324-534-70875587168413/AnsiballZ_file.py'
Jan 23 08:54:47 compute-0 sudo[35957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:48 compute-0 python3.9[35959]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:54:48 compute-0 sudo[35957]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:49 compute-0 sudo[36109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqeedeerkewfntlgalihuxokecvqide ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158488.9465716-558-226451407114836/AnsiballZ_mount.py'
Jan 23 08:54:49 compute-0 sudo[36109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:49 compute-0 python3.9[36111]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 08:54:49 compute-0 sudo[36109]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:50 compute-0 sudo[36261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsjykyezcslewjfbvebisrpojmjqqts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158490.2972035-642-82344871703457/AnsiballZ_file.py'
Jan 23 08:54:50 compute-0 sudo[36261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:50 compute-0 python3.9[36263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:54:50 compute-0 sudo[36261]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:51 compute-0 sudo[36413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqdotmlmnrmxcumdcoonzkygwesfbdom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158490.8190706-666-80750037503861/AnsiballZ_stat.py'
Jan 23 08:54:51 compute-0 sudo[36413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:51 compute-0 python3.9[36415]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:54:51 compute-0 sudo[36413]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:51 compute-0 sudo[36536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pozhmjlbfthfmonamcivrfuctkiojbwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158490.8190706-666-80750037503861/AnsiballZ_copy.py'
Jan 23 08:54:51 compute-0 sudo[36536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:51 compute-0 python3.9[36538]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158490.8190706-666-80750037503861/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:54:51 compute-0 sudo[36536]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:52 compute-0 sudo[36688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msrjdezwukykntpngzaesuyrubuxoqbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158492.2840173-738-218035969629092/AnsiballZ_stat.py'
Jan 23 08:54:52 compute-0 sudo[36688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:52 compute-0 python3.9[36690]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:54:52 compute-0 sudo[36688]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:52 compute-0 sudo[36840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvpbkkvrtslyscpgdrxofdnelhdqybmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158492.7547016-762-43950915607784/AnsiballZ_command.py'
Jan 23 08:54:52 compute-0 sudo[36840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:55 compute-0 python3.9[36842]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:54:55 compute-0 sudo[36840]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:56 compute-0 sudo[36993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlqmbyiwdfrclnodhlwiatqvsljynype ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158496.0061624-786-116926809248500/AnsiballZ_file.py'
Jan 23 08:54:56 compute-0 sudo[36993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:56 compute-0 python3.9[36995]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:54:56 compute-0 sudo[36993]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:57 compute-0 sudo[37145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mldqgqmpuzsoqyqnvfopngfnzuxsedoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158496.814404-819-48012026769800/AnsiballZ_getent.py'
Jan 23 08:54:57 compute-0 sudo[37145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:57 compute-0 python3.9[37147]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 08:54:57 compute-0 sudo[37145]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:57 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 08:54:57 compute-0 sudo[37299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prrxaqjhpprgdevzmeduaraombgrkkow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158497.4824479-843-52667711037969/AnsiballZ_group.py'
Jan 23 08:54:57 compute-0 sudo[37299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:57 compute-0 python3.9[37301]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 08:54:57 compute-0 groupadd[37302]: group added to /etc/group: name=qemu, GID=107
Jan 23 08:54:58 compute-0 groupadd[37302]: group added to /etc/gshadow: name=qemu
Jan 23 08:54:58 compute-0 groupadd[37302]: new group: name=qemu, GID=107
Jan 23 08:54:58 compute-0 sudo[37299]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:58 compute-0 sudo[37457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtqkxujkpqmygrvxurnyapmugnnmwbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158498.1788146-867-152942221634824/AnsiballZ_user.py'
Jan 23 08:54:58 compute-0 sudo[37457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:58 compute-0 python3.9[37459]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 08:54:58 compute-0 useradd[37461]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 08:54:58 compute-0 sudo[37457]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:59 compute-0 sudo[37617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzrytxcrsgksovffzrqnycopgqhlodfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158499.0098217-891-103828386057525/AnsiballZ_getent.py'
Jan 23 08:54:59 compute-0 sudo[37617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:59 compute-0 python3.9[37619]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 08:54:59 compute-0 sudo[37617]: pam_unix(sudo:session): session closed for user root
Jan 23 08:54:59 compute-0 sudo[37770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efyukjpypikkxwbnkguchdilzfeokxgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158499.5586016-915-66231892731372/AnsiballZ_group.py'
Jan 23 08:54:59 compute-0 sudo[37770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:54:59 compute-0 python3.9[37772]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 08:54:59 compute-0 groupadd[37773]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 23 08:54:59 compute-0 groupadd[37773]: group added to /etc/gshadow: name=hugetlbfs
Jan 23 08:54:59 compute-0 groupadd[37773]: new group: name=hugetlbfs, GID=42477
Jan 23 08:54:59 compute-0 sudo[37770]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:00 compute-0 sudo[37928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stsaakitizecsvbehmtrlaiwrbqomlya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158500.1887403-942-104094451379970/AnsiballZ_file.py'
Jan 23 08:55:00 compute-0 sudo[37928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:00 compute-0 python3.9[37930]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 08:55:00 compute-0 sudo[37928]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:01 compute-0 sudo[38080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-henxysfpruntzhrtivfmwbynlbkjfplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158500.9266922-975-19502761959236/AnsiballZ_dnf.py'
Jan 23 08:55:01 compute-0 sudo[38080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:01 compute-0 python3.9[38082]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:55:02 compute-0 sudo[38080]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:02 compute-0 sudo[38233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuxnlmuigwuxxezogndahqbrbdgzhaqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158502.7202322-999-262815230535103/AnsiballZ_file.py'
Jan 23 08:55:02 compute-0 sudo[38233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:03 compute-0 python3.9[38235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:03 compute-0 sudo[38233]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:03 compute-0 sudo[38385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgewnnuapzmwjnhauehujjtgfljhmgtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158503.3059318-1023-119604286799912/AnsiballZ_stat.py'
Jan 23 08:55:03 compute-0 sudo[38385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:03 compute-0 python3.9[38387]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:55:03 compute-0 sudo[38385]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:03 compute-0 sudo[38508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fljhrhcbyjdyklndiyjkwbgzvltyqjyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158503.3059318-1023-119604286799912/AnsiballZ_copy.py'
Jan 23 08:55:03 compute-0 sudo[38508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:04 compute-0 python3.9[38510]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158503.3059318-1023-119604286799912/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:04 compute-0 sudo[38508]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:04 compute-0 sudo[38660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daebtkvagizetmsdeabfekgixaiygtqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158504.29036-1068-27086354003029/AnsiballZ_systemd.py'
Jan 23 08:55:04 compute-0 sudo[38660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:05 compute-0 python3.9[38662]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:55:05 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 23 08:55:05 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 08:55:05 compute-0 kernel: Bridge firewalling registered
Jan 23 08:55:05 compute-0 systemd-modules-load[38666]: Inserted module 'br_netfilter'
Jan 23 08:55:05 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 23 08:55:05 compute-0 sudo[38660]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:05 compute-0 sudo[38819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygpudoiqfhwnwowprpoujjfwvyhmylit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158505.3118937-1092-66367190131366/AnsiballZ_stat.py'
Jan 23 08:55:05 compute-0 sudo[38819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:05 compute-0 python3.9[38821]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:55:05 compute-0 sudo[38819]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:06 compute-0 sudo[38942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukkzqkkhygiexhbqlvufgzoyjqgssuwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158505.3118937-1092-66367190131366/AnsiballZ_copy.py'
Jan 23 08:55:06 compute-0 sudo[38942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:06 compute-0 python3.9[38944]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158505.3118937-1092-66367190131366/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:06 compute-0 sudo[38942]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:06 compute-0 sudo[39094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tthihjakicnojjirnkdabiriqndzhnhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158506.631027-1146-204580536451423/AnsiballZ_dnf.py'
Jan 23 08:55:06 compute-0 sudo[39094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:07 compute-0 python3.9[39096]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:55:14 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 08:55:14 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 08:55:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:55:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:55:15 compute-0 systemd[1]: Reloading.
Jan 23 08:55:15 compute-0 systemd-rc-local-generator[39160]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:55:15 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:55:15 compute-0 sudo[39094]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:16 compute-0 python3.9[40876]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:55:17 compute-0 python3.9[42004]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 08:55:17 compute-0 python3.9[42860]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:55:17 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:55:17 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:55:17 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.187s CPU time.
Jan 23 08:55:17 compute-0 systemd[1]: run-r6688b800d256444f8b86e7e59dc0ef06.service: Deactivated successfully.
Jan 23 08:55:18 compute-0 sudo[43265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pybetetvxowslrzccgcjvfdqanuwqhgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158517.9418755-1263-222707769013379/AnsiballZ_command.py'
Jan 23 08:55:18 compute-0 sudo[43265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:18 compute-0 python3.9[43267]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:18 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 08:55:18 compute-0 systemd[1]: Starting Authorization Manager...
Jan 23 08:55:18 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 08:55:18 compute-0 polkitd[43484]: Started polkitd version 0.117
Jan 23 08:55:18 compute-0 polkitd[43484]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 08:55:18 compute-0 polkitd[43484]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 08:55:18 compute-0 polkitd[43484]: Finished loading, compiling and executing 2 rules
Jan 23 08:55:18 compute-0 polkitd[43484]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 23 08:55:18 compute-0 systemd[1]: Started Authorization Manager.
Jan 23 08:55:18 compute-0 sudo[43265]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:19 compute-0 sudo[43648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwnwckyyxjtyfhrsllvywprgrjnswpbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158519.0466223-1290-83090093444872/AnsiballZ_systemd.py'
Jan 23 08:55:19 compute-0 sudo[43648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:19 compute-0 python3.9[43650]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:55:19 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 08:55:19 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 08:55:19 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 08:55:19 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 08:55:19 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 08:55:19 compute-0 sudo[43648]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:20 compute-0 python3.9[43812]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 08:55:23 compute-0 sudo[43962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crqgebvmdqxyqjmkztjaaxyenvwbeldj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158523.0894127-1461-111291355925174/AnsiballZ_systemd.py'
Jan 23 08:55:23 compute-0 sudo[43962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:23 compute-0 python3.9[43964]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:55:23 compute-0 systemd[1]: Reloading.
Jan 23 08:55:23 compute-0 systemd-rc-local-generator[43989]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:55:23 compute-0 sudo[43962]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:24 compute-0 sudo[44151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndzgqqftljlscuwoviqwzaxrnfpzeqzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158523.8189762-1461-103944866116574/AnsiballZ_systemd.py'
Jan 23 08:55:24 compute-0 sudo[44151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:24 compute-0 python3.9[44153]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:55:24 compute-0 systemd[1]: Reloading.
Jan 23 08:55:24 compute-0 systemd-rc-local-generator[44178]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:55:24 compute-0 sudo[44151]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:24 compute-0 sudo[44340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwsrnylrkaylneadqjrwazjwdximbcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158524.7061656-1509-56779041622305/AnsiballZ_command.py'
Jan 23 08:55:24 compute-0 sudo[44340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:25 compute-0 python3.9[44342]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:25 compute-0 sudo[44340]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:25 compute-0 sudo[44493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaiumponsjsuikmwsxpszdizyentrqyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158525.2431567-1533-269491183346916/AnsiballZ_command.py'
Jan 23 08:55:25 compute-0 sudo[44493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:25 compute-0 python3.9[44495]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:25 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 08:55:25 compute-0 sudo[44493]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:25 compute-0 sudo[44646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdcynuyqxhvzhhcukphnjpusnbisgash ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158525.7631998-1557-20578329096677/AnsiballZ_command.py'
Jan 23 08:55:25 compute-0 sudo[44646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:26 compute-0 python3.9[44648]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:27 compute-0 sudo[44646]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:27 compute-0 sudo[44808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asjznaztfrdqnocrsszzlxwcsbfjdeqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158527.3219075-1581-140670389778431/AnsiballZ_command.py'
Jan 23 08:55:27 compute-0 sudo[44808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:27 compute-0 python3.9[44810]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:27 compute-0 sudo[44808]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:28 compute-0 sudo[44961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeainreyeervlcfrvdirhuwkfsnffidb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158527.8212469-1605-173858679659551/AnsiballZ_systemd.py'
Jan 23 08:55:28 compute-0 sudo[44961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:28 compute-0 python3.9[44963]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:55:28 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 08:55:28 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 08:55:28 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 08:55:28 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 23 08:55:28 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 08:55:28 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 23 08:55:28 compute-0 sudo[44961]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:28 compute-0 sshd-session[31407]: Connection closed by 192.168.122.30 port 58642
Jan 23 08:55:28 compute-0 sshd-session[31404]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:55:28 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 08:55:28 compute-0 systemd[1]: session-9.scope: Consumed 1min 39.155s CPU time.
Jan 23 08:55:28 compute-0 systemd-logind[746]: Session 9 logged out. Waiting for processes to exit.
Jan 23 08:55:28 compute-0 systemd-logind[746]: Removed session 9.
Jan 23 08:55:34 compute-0 sshd-session[44994]: Accepted publickey for zuul from 192.168.122.30 port 42812 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:55:34 compute-0 systemd-logind[746]: New session 10 of user zuul.
Jan 23 08:55:34 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 23 08:55:34 compute-0 sshd-session[44994]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:55:34 compute-0 python3.9[45147]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:55:35 compute-0 python3.9[45301]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:55:36 compute-0 sudo[45455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntwndwttllgztuimpowynwhbgvvjrhnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158536.5183258-110-137972500344735/AnsiballZ_command.py'
Jan 23 08:55:36 compute-0 sudo[45455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:36 compute-0 python3.9[45457]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:36 compute-0 sudo[45455]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:37 compute-0 python3.9[45608]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:55:38 compute-0 sudo[45762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfwfxpktypjljvyjanfkjrddprllvsbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158538.0572996-170-39318661854463/AnsiballZ_setup.py'
Jan 23 08:55:38 compute-0 sudo[45762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:38 compute-0 python3.9[45764]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:55:38 compute-0 sudo[45762]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:38 compute-0 sudo[45846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxfwzsauspwhahdvqxkegrxuoedzwxmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158538.0572996-170-39318661854463/AnsiballZ_dnf.py'
Jan 23 08:55:39 compute-0 sudo[45846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:39 compute-0 python3.9[45848]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:55:40 compute-0 sudo[45846]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:40 compute-0 sudo[45999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abusjdacnxzbfdaykeyviqvdzqslvkvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158540.228201-206-56158129491231/AnsiballZ_setup.py'
Jan 23 08:55:40 compute-0 sudo[45999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:40 compute-0 python3.9[46001]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:55:40 compute-0 sudo[45999]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:41 compute-0 sudo[46170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esgwmdbzjjwsqwaikslaadrcjxzkksoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158541.0157747-239-81963238243368/AnsiballZ_file.py'
Jan 23 08:55:41 compute-0 sudo[46170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:41 compute-0 python3.9[46172]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:55:41 compute-0 sudo[46170]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:41 compute-0 sudo[46322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwaqqhmiairxfeqszboivqpkdfbpxel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158541.6226337-263-205777193264905/AnsiballZ_command.py'
Jan 23 08:55:41 compute-0 sudo[46322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:41 compute-0 python3.9[46324]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:55:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat353067602-merged.mount: Deactivated successfully.
Jan 23 08:55:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1151252567-merged.mount: Deactivated successfully.
Jan 23 08:55:41 compute-0 podman[46325]: 2026-01-23 08:55:41.988699255 +0000 UTC m=+0.037183219 system refresh
Jan 23 08:55:42 compute-0 sudo[46322]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:42 compute-0 sudo[46483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyogqpfwmvbykcehhkepsegeodhbwuel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158542.3572502-287-74528878738682/AnsiballZ_stat.py'
Jan 23 08:55:42 compute-0 sudo[46483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:42 compute-0 python3.9[46485]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:55:42 compute-0 sudo[46483]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-opaque\x2dbug\x2dcheck3853261186-merged.mount: Deactivated successfully.
Jan 23 08:55:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:55:43 compute-0 sudo[46606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccnjnhvbzixfiyrvcfrgqltaludysnap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158542.3572502-287-74528878738682/AnsiballZ_copy.py'
Jan 23 08:55:43 compute-0 sudo[46606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:43 compute-0 python3.9[46608]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158542.3572502-287-74528878738682/.source.json follow=False _original_basename=podman_network_config.j2 checksum=4d26f7f5affa107689739ed59175d0f531c886f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:55:43 compute-0 sudo[46606]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:43 compute-0 sudo[46758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobkrzecdulfeamnaklmfwrfzxyjylwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158543.4657595-332-223203740901032/AnsiballZ_stat.py'
Jan 23 08:55:43 compute-0 sudo[46758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:43 compute-0 python3.9[46760]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:55:43 compute-0 sudo[46758]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:44 compute-0 sudo[46881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdedeoklppefjoqnbeopckfsjgtlcfcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158543.4657595-332-223203740901032/AnsiballZ_copy.py'
Jan 23 08:55:44 compute-0 sudo[46881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:44 compute-0 python3.9[46883]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158543.4657595-332-223203740901032/.source.conf follow=False _original_basename=registries.conf.j2 checksum=7d6103ee1a01cd01d921f72f1af62704e0a47ff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:44 compute-0 sudo[46881]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:44 compute-0 sudo[47033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwiwxeoyzfdiaqkvfngsdtqywolywvsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158544.406129-380-277229661620020/AnsiballZ_ini_file.py'
Jan 23 08:55:44 compute-0 sudo[47033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:44 compute-0 python3.9[47035]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:44 compute-0 sudo[47033]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:45 compute-0 sudo[47185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbruuzgubkhvdrajjjcanbnecwzuwmvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158545.0133176-380-239059303855627/AnsiballZ_ini_file.py'
Jan 23 08:55:45 compute-0 sudo[47185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:45 compute-0 python3.9[47187]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:45 compute-0 sudo[47185]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:45 compute-0 sudo[47337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clpfmrfypkouidupmeowmhdoafnoxqsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158545.4952912-380-149372243258933/AnsiballZ_ini_file.py'
Jan 23 08:55:45 compute-0 sudo[47337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:45 compute-0 python3.9[47339]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:45 compute-0 sudo[47337]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:46 compute-0 sudo[47489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbtahcagmhinxbrbogwnbbtjreivmwfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158545.9757104-380-221668255853724/AnsiballZ_ini_file.py'
Jan 23 08:55:46 compute-0 sudo[47489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:46 compute-0 python3.9[47491]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:55:46 compute-0 sudo[47489]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:47 compute-0 python3.9[47641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:55:47 compute-0 sudo[47793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvsgzjmcpnhcxbwqrqcggsirocqopquv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158547.3240309-500-168155263109797/AnsiballZ_dnf.py'
Jan 23 08:55:47 compute-0 sudo[47793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:47 compute-0 python3.9[47795]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:55:48 compute-0 sudo[47793]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:49 compute-0 sudo[47946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwkonbfqchkteyuyavqlbhhtlnwxpffv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158548.815827-524-204805244712567/AnsiballZ_dnf.py'
Jan 23 08:55:49 compute-0 sudo[47946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:49 compute-0 python3.9[47948]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:55:51 compute-0 sudo[47946]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:52 compute-0 sudo[48106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixaugsjjbnjvoinyqiwuspqyzncpbmkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158552.534867-554-212747034723910/AnsiballZ_dnf.py'
Jan 23 08:55:52 compute-0 sudo[48106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:52 compute-0 python3.9[48108]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:55:53 compute-0 sudo[48106]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:54 compute-0 sudo[48259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnudshropdmvqnrhldiiadsohuudxdtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158554.1534271-581-109836333217602/AnsiballZ_dnf.py'
Jan 23 08:55:54 compute-0 sudo[48259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:54 compute-0 python3.9[48261]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:55:55 compute-0 sudo[48259]: pam_unix(sudo:session): session closed for user root
Jan 23 08:55:56 compute-0 sudo[48412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtnbhgipeptkkmdlqkfuoykcegnmvfxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158555.9036593-614-143836146563298/AnsiballZ_dnf.py'
Jan 23 08:55:56 compute-0 sudo[48412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:55:56 compute-0 python3.9[48414]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:55:58 compute-0 sudo[48412]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:00 compute-0 sudo[48568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ondfsczhkcwfefxdgrbvinrvetolozor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158559.9832456-638-222598862139212/AnsiballZ_dnf.py'
Jan 23 08:56:00 compute-0 sudo[48568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:00 compute-0 python3.9[48570]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:56:05 compute-0 sudo[48568]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:06 compute-0 sudo[48739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyyhlsqkmfyddafeaagxeqynxrzyhjqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158566.441807-665-85078269251613/AnsiballZ_dnf.py'
Jan 23 08:56:06 compute-0 sudo[48739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:06 compute-0 python3.9[48741]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:56:07 compute-0 sudo[48739]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:08 compute-0 sudo[48892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqryboalcmvguxkfpjngruktpomxctpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158568.1665208-692-258729698656331/AnsiballZ_dnf.py'
Jan 23 08:56:08 compute-0 sudo[48892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:08 compute-0 python3.9[48894]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:56:21 compute-0 sudo[48892]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:45 compute-0 sudo[49230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlyylczfzifgclvsvxazgnizpgavanhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158605.391355-719-93303077013419/AnsiballZ_dnf.py'
Jan 23 08:56:45 compute-0 sudo[49230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:45 compute-0 python3.9[49232]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:56:46 compute-0 sudo[49230]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:47 compute-0 sudo[49386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bveumfxgkuiammzihssymjywymfjrozt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158607.084667-749-55358674054750/AnsiballZ_dnf.py'
Jan 23 08:56:47 compute-0 sudo[49386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:47 compute-0 python3.9[49388]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:56:50 compute-0 sudo[49386]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:51 compute-0 sudo[49544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuybfhcedkobzmaihisfahjqrehysmdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158610.9488935-782-266520541352799/AnsiballZ_file.py'
Jan 23 08:56:51 compute-0 sudo[49544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:51 compute-0 python3.9[49546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:56:51 compute-0 sudo[49544]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:51 compute-0 sudo[49719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uizedmlohmoaaxwsgmnfntjgoligrhyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158611.4486737-806-192985809697396/AnsiballZ_stat.py'
Jan 23 08:56:51 compute-0 sudo[49719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:51 compute-0 python3.9[49721]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:56:51 compute-0 sudo[49719]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:52 compute-0 sudo[49842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlqhruhdjnfohhygxrhypafrzbyaxhpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158611.4486737-806-192985809697396/AnsiballZ_copy.py'
Jan 23 08:56:52 compute-0 sudo[49842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:52 compute-0 python3.9[49844]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769158611.4486737-806-192985809697396/.source.json _original_basename=.pbi_zldt follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:56:52 compute-0 sudo[49842]: pam_unix(sudo:session): session closed for user root
Jan 23 08:56:52 compute-0 sudo[49994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apcmosopegtcpsfwbkumazufksrphjwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158612.4987261-860-203127706389067/AnsiballZ_podman_image.py'
Jan 23 08:56:52 compute-0 sudo[49994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:56:52 compute-0 python3.9[49996]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 23 08:56:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:56:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3987267001-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 08:56:59 compute-0 podman[50006]: 2026-01-23 08:56:59.467557975 +0000 UTC m=+6.432676467 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 08:56:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:56:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:56:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:56:59 compute-0 sudo[49994]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:00 compute-0 sudo[50274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjlbukkbtsuqletaagbstriutekdukvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158620.240103-899-138166776741897/AnsiballZ_podman_image.py'
Jan 23 08:57:00 compute-0 sudo[50274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:00 compute-0 python3.9[50276]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 23 08:57:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:14 compute-0 podman[50287]: 2026-01-23 08:57:14.427806013 +0000 UTC m=+13.806371445 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 08:57:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:14 compute-0 sudo[50274]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:16 compute-0 sudo[50538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xronwuzwplbzwvtdfqsxatbkypkhparr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158635.951695-932-191699217980556/AnsiballZ_podman_image.py'
Jan 23 08:57:16 compute-0 sudo[50538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:16 compute-0 python3.9[50540]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 23 08:57:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:20 compute-0 podman[50550]: 2026-01-23 08:57:20.17730781 +0000 UTC m=+3.830071965 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 23 08:57:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:20 compute-0 sudo[50538]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:20 compute-0 sudo[50782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uimaymklcznvtxaqptyqwtrradxiemha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158640.4174817-932-263639953434662/AnsiballZ_podman_image.py'
Jan 23 08:57:20 compute-0 sudo[50782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:20 compute-0 python3.9[50784]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 23 08:57:22 compute-0 podman[50794]: 2026-01-23 08:57:22.701864727 +0000 UTC m=+1.900035614 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 23 08:57:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:57:22 compute-0 sudo[50782]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:23 compute-0 sshd-session[44997]: Connection closed by 192.168.122.30 port 42812
Jan 23 08:57:23 compute-0 sshd-session[44994]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:57:23 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 08:57:23 compute-0 systemd[1]: session-10.scope: Consumed 1min 8.356s CPU time.
Jan 23 08:57:23 compute-0 systemd-logind[746]: Session 10 logged out. Waiting for processes to exit.
Jan 23 08:57:23 compute-0 systemd-logind[746]: Removed session 10.
Jan 23 08:57:29 compute-0 sshd-session[50917]: Accepted publickey for zuul from 192.168.122.30 port 59554 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:57:29 compute-0 systemd-logind[746]: New session 11 of user zuul.
Jan 23 08:57:29 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 23 08:57:29 compute-0 sshd-session[50917]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:57:30 compute-0 python3.9[51070]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:57:30 compute-0 sudo[51224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fokjwltvdfcmljkleqgdecuqnbonffkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158650.5424519-68-67953222007020/AnsiballZ_getent.py'
Jan 23 08:57:30 compute-0 sudo[51224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:30 compute-0 python3.9[51226]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 08:57:30 compute-0 sudo[51224]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:31 compute-0 sudo[51377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trnaaaacgnvdaennghtsmrdeahbecadi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158651.1431572-92-149666746848138/AnsiballZ_group.py'
Jan 23 08:57:31 compute-0 sudo[51377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:31 compute-0 python3.9[51379]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 08:57:31 compute-0 groupadd[51380]: group added to /etc/group: name=openvswitch, GID=42476
Jan 23 08:57:31 compute-0 groupadd[51380]: group added to /etc/gshadow: name=openvswitch
Jan 23 08:57:31 compute-0 groupadd[51380]: new group: name=openvswitch, GID=42476
Jan 23 08:57:31 compute-0 sudo[51377]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:32 compute-0 sudo[51535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfsphqkenzwrkepetyzuiurdwtqhihn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158651.7708025-116-118216562717258/AnsiballZ_user.py'
Jan 23 08:57:32 compute-0 sudo[51535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:32 compute-0 python3.9[51537]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 08:57:32 compute-0 useradd[51539]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 08:57:32 compute-0 useradd[51539]: add 'openvswitch' to group 'hugetlbfs'
Jan 23 08:57:32 compute-0 useradd[51539]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 23 08:57:32 compute-0 sudo[51535]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:32 compute-0 sudo[51695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpfzeuuiykfixepdqssxvgatvcxrkmza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158652.5976632-146-45820550185793/AnsiballZ_setup.py'
Jan 23 08:57:32 compute-0 sudo[51695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:33 compute-0 python3.9[51697]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:57:33 compute-0 sudo[51695]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:33 compute-0 sudo[51779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmayxylueeuswbqypaxgyndtoalwqcri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158652.5976632-146-45820550185793/AnsiballZ_dnf.py'
Jan 23 08:57:33 compute-0 sudo[51779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:33 compute-0 python3.9[51781]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 08:57:36 compute-0 sudo[51779]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:37 compute-0 sudo[51941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szljdmghbnfyvajspeizuvqnjimjkdnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158657.0351675-188-144071345734990/AnsiballZ_dnf.py'
Jan 23 08:57:37 compute-0 sudo[51941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:37 compute-0 python3.9[51943]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:57:45 compute-0 kernel: SELinux:  Converting 2738 SID table entries...
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:57:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:57:45 compute-0 groupadd[51966]: group added to /etc/group: name=unbound, GID=994
Jan 23 08:57:45 compute-0 groupadd[51966]: group added to /etc/gshadow: name=unbound
Jan 23 08:57:45 compute-0 groupadd[51966]: new group: name=unbound, GID=994
Jan 23 08:57:45 compute-0 useradd[51973]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 23 08:57:45 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 08:57:45 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 08:57:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:57:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:57:46 compute-0 systemd[1]: Reloading.
Jan 23 08:57:46 compute-0 systemd-rc-local-generator[52464]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:57:46 compute-0 systemd-sysv-generator[52467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:57:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:57:47 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:57:47 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:57:47 compute-0 systemd[1]: run-r31459793249b4b1d874e8cd049c2d554.service: Deactivated successfully.
Jan 23 08:57:47 compute-0 sudo[51941]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:48 compute-0 sudo[53039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oievvpiqfyxwwelqblxrqqdliwteugku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158667.625923-212-100440250206438/AnsiballZ_systemd.py'
Jan 23 08:57:48 compute-0 sudo[53039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:48 compute-0 python3.9[53041]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 08:57:48 compute-0 systemd[1]: Reloading.
Jan 23 08:57:48 compute-0 systemd-rc-local-generator[53065]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:57:48 compute-0 systemd-sysv-generator[53068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:57:48 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 08:57:48 compute-0 chown[53082]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 08:57:48 compute-0 ovs-ctl[53087]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 08:57:48 compute-0 ovs-ctl[53087]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 08:57:48 compute-0 ovs-ctl[53087]: Starting ovsdb-server [  OK  ]
Jan 23 08:57:48 compute-0 ovs-vsctl[53136]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 08:57:48 compute-0 ovs-vsctl[53156]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"5bdcc3dc-0ac8-4139-a9e7-75947c17f20e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 08:57:48 compute-0 ovs-ctl[53087]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 08:57:48 compute-0 ovs-ctl[53087]: Enabling remote OVSDB managers [  OK  ]
Jan 23 08:57:48 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 08:57:48 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 08:57:48 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 08:57:48 compute-0 ovs-vsctl[53173]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 23 08:57:48 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 08:57:48 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 08:57:48 compute-0 ovs-ctl[53206]: Inserting openvswitch module [  OK  ]
Jan 23 08:57:48 compute-0 ovs-ctl[53175]: Starting ovs-vswitchd [  OK  ]
Jan 23 08:57:48 compute-0 ovs-ctl[53175]: Enabling remote OVSDB managers [  OK  ]
Jan 23 08:57:48 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 08:57:48 compute-0 ovs-vsctl[53224]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 23 08:57:48 compute-0 systemd[1]: Starting Open vSwitch...
Jan 23 08:57:48 compute-0 systemd[1]: Finished Open vSwitch.
Jan 23 08:57:48 compute-0 sudo[53039]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:49 compute-0 python3.9[53375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:57:50 compute-0 sudo[53525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhrrtniqtnpchzwyltnaeqmwbfsqsrln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158669.9264-266-222755946428044/AnsiballZ_sefcontext.py'
Jan 23 08:57:50 compute-0 sudo[53525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:50 compute-0 python3.9[53527]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 08:57:51 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 08:57:51 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 08:57:51 compute-0 sudo[53525]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:52 compute-0 python3.9[53682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:57:52 compute-0 sudo[53838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdgqbsmtpeuclcsgmdtxpgnntdybaelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158672.3668208-320-185948906021123/AnsiballZ_dnf.py'
Jan 23 08:57:52 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 08:57:52 compute-0 sudo[53838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:52 compute-0 python3.9[53840]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:57:53 compute-0 sudo[53838]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:54 compute-0 sudo[53992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btuufdjkhllpdykayfjsihuzhdxupwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158673.8872092-344-7705960527801/AnsiballZ_command.py'
Jan 23 08:57:54 compute-0 sudo[53992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:54 compute-0 python3.9[53994]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:57:54 compute-0 sudo[53992]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:55 compute-0 sudo[54279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqsalzitepmiokggpawnhcpdrcjwmpko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158675.037789-368-263096179171702/AnsiballZ_file.py'
Jan 23 08:57:55 compute-0 sudo[54279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:55 compute-0 python3.9[54281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 08:57:55 compute-0 sudo[54279]: pam_unix(sudo:session): session closed for user root
Jan 23 08:57:56 compute-0 python3.9[54431]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:57:56 compute-0 sudo[54583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcchhemrfxipvexskjddtdxczyovqqfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158676.2685807-416-264937162232120/AnsiballZ_dnf.py'
Jan 23 08:57:56 compute-0 sudo[54583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:57:56 compute-0 python3.9[54585]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:57:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:57:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:57:59 compute-0 systemd[1]: Reloading.
Jan 23 08:57:59 compute-0 systemd-sysv-generator[54619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:57:59 compute-0 systemd-rc-local-generator[54616]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:57:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:57:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:57:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:57:59 compute-0 systemd[1]: run-r9847579d217c40b580f120051449041f.service: Deactivated successfully.
Jan 23 08:57:59 compute-0 sudo[54583]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:00 compute-0 sudo[54900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eejqqccbdsmhmpzvnvcwjprjvxeniiif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158679.895142-440-26184500991588/AnsiballZ_systemd.py'
Jan 23 08:58:00 compute-0 sudo[54900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:00 compute-0 python3.9[54902]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:58:00 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 08:58:00 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 08:58:00 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 08:58:00 compute-0 systemd[1]: Stopping Network Manager...
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3528] caught SIGTERM, shutting down normally.
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3535] dhcp4 (eth0): canceled DHCP transaction
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3536] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3536] dhcp4 (eth0): state changed no lease
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3537] dhcp6 (eth0): canceled DHCP transaction
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3538] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3538] dhcp6 (eth0): state changed no lease
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3539] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 08:58:00 compute-0 NetworkManager[7254]: <info>  [1769158680.3569] exiting (success)
Jan 23 08:58:00 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:58:00 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:58:00 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 08:58:00 compute-0 systemd[1]: Stopped Network Manager.
Jan 23 08:58:00 compute-0 systemd[1]: Starting Network Manager...
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4225] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:1b526f85-5c85-4390-8f4f-c5e9ef0f805d)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4227] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4269] manager[0x55dafa0d5000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 08:58:00 compute-0 systemd[1]: Starting Hostname Service...
Jan 23 08:58:00 compute-0 systemd[1]: Started Hostname Service.
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4817] hostname: hostname: using hostnamed
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4818] hostname: static hostname changed from (none) to "compute-0"
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4821] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4823] manager[0x55dafa0d5000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4823] manager[0x55dafa0d5000]: rfkill: WWAN hardware radio set enabled
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4838] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4844] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4845] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4845] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4846] manager: Networking is enabled by state file
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4848] settings: Loaded settings plugin: keyfile (internal)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4855] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4878] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4884] dhcp: init: Using DHCP client 'internal'
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4887] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4891] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4894] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4900] device (lo): Activation: starting connection 'lo' (ab218173-ee72-4eee-b4f5-cff3eabad9ac)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4905] device (eth0): carrier: link connected
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4908] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4912] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4912] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4917] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4921] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4925] device (eth1): carrier: link connected
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4928] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4932] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (eb1a661b-2380-537d-bcf6-15c64e0c452b) (indicated)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4932] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4935] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4941] device (eth1): Activation: starting connection 'ci-private-network' (eb1a661b-2380-537d-bcf6-15c64e0c452b)
Jan 23 08:58:00 compute-0 systemd[1]: Started Network Manager.
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4947] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4952] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4954] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4955] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4957] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4959] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4960] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4961] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4973] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4978] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4980] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4982] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4991] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4993] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.4997] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5005] dhcp4 (eth0): state changed new lease, address=192.168.26.29
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5010] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5030] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5031] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5032] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5035] device (lo): Activation: successful, device activated.
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5039] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5040] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 08:58:00 compute-0 NetworkManager[54920]: <info>  [1769158680.5042] device (eth1): Activation: successful, device activated.
Jan 23 08:58:00 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 23 08:58:00 compute-0 sudo[54900]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:00 compute-0 sudo[55109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lndlcwkgwclkweihahyynsbbnoggubdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158680.6768837-464-89400961558876/AnsiballZ_dnf.py'
Jan 23 08:58:00 compute-0 sudo[55109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:01 compute-0 python3.9[55111]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5394] dhcp6 (eth0): state changed new lease, address=2001:db8::397
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5403] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5434] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5435] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5438] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5440] device (eth0): Activation: successful, device activated.
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5444] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 08:58:01 compute-0 NetworkManager[54920]: <info>  [1769158681.5449] manager: startup complete
Jan 23 08:58:01 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 23 08:58:10 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 08:58:10 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 08:58:10 compute-0 systemd[1]: Reloading.
Jan 23 08:58:10 compute-0 systemd-sysv-generator[55182]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:58:10 compute-0 systemd-rc-local-generator[55179]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:58:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 08:58:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 08:58:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 08:58:11 compute-0 systemd[1]: run-r905c5d3f448347049f9610408b6bf880.service: Deactivated successfully.
Jan 23 08:58:11 compute-0 sudo[55109]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:11 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:58:11 compute-0 sudo[55590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkamhralocjhewdfxsmurfbxsakytwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158691.8065953-500-3572977969309/AnsiballZ_stat.py'
Jan 23 08:58:11 compute-0 sudo[55590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:12 compute-0 python3.9[55592]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:58:12 compute-0 sudo[55590]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:12 compute-0 sudo[55742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivwwntuevjcbdwsnhkhezelscdejhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158692.319275-527-140499813874655/AnsiballZ_ini_file.py'
Jan 23 08:58:12 compute-0 sudo[55742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:12 compute-0 python3.9[55744]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:12 compute-0 sudo[55742]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:13 compute-0 sudo[55896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bifqmelqebxyxbplkpcjkrikihpycbxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158693.0713272-557-248717911038745/AnsiballZ_ini_file.py'
Jan 23 08:58:13 compute-0 sudo[55896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:13 compute-0 python3.9[55898]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:13 compute-0 sudo[55896]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:13 compute-0 sudo[56048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxxzmwhyutpktduglnekwuzkezosrbpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158693.525127-557-26561849094735/AnsiballZ_ini_file.py'
Jan 23 08:58:13 compute-0 sudo[56048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:13 compute-0 python3.9[56050]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:13 compute-0 sudo[56048]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:14 compute-0 sudo[56202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqrnvbhpmywcpyjyycrgvsusuqdtluu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158694.125728-602-69259603307421/AnsiballZ_ini_file.py'
Jan 23 08:58:14 compute-0 sudo[56202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:14 compute-0 python3.9[56204]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:14 compute-0 sudo[56202]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:14 compute-0 sudo[56354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diiqyumsuaiyhenzczjstvpotdtirksp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158694.5666268-602-55241657289246/AnsiballZ_ini_file.py'
Jan 23 08:58:14 compute-0 sudo[56354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:14 compute-0 python3.9[56356]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:14 compute-0 sudo[56354]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:15 compute-0 sudo[56506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbecelyllegmtoxdxnaystkfudismnvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158695.076847-647-238959357223520/AnsiballZ_stat.py'
Jan 23 08:58:15 compute-0 sudo[56506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:15 compute-0 python3.9[56508]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:58:15 compute-0 sudo[56506]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:15 compute-0 sudo[56629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeirymadsrtftkvuhclxfrjnoktgelwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158695.076847-647-238959357223520/AnsiballZ_copy.py'
Jan 23 08:58:15 compute-0 sudo[56629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:15 compute-0 python3.9[56631]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158695.076847-647-238959357223520/.source _original_basename=.bp7zvrez follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:15 compute-0 sudo[56629]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:16 compute-0 sudo[56781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyzwyreswakfziqrxyzlfujswghufuku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158696.0628376-692-164731209516241/AnsiballZ_file.py'
Jan 23 08:58:16 compute-0 sudo[56781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:16 compute-0 python3.9[56783]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:16 compute-0 sudo[56781]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:16 compute-0 sudo[56933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvyqbdvxmcvbaevvtfarbblquqvudkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158696.5905013-716-66825244040451/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 23 08:58:16 compute-0 sudo[56933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:17 compute-0 python3.9[56935]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 08:58:17 compute-0 sudo[56933]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:17 compute-0 sudo[57085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdsvkbvgjryqxredqjuiwolixbytcfni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158697.244089-743-114253055447558/AnsiballZ_file.py'
Jan 23 08:58:17 compute-0 sudo[57085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:17 compute-0 python3.9[57087]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:17 compute-0 sudo[57085]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:18 compute-0 sudo[57237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdzfzkcwismsgxivvppuvgbqdxvsberu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158697.881557-773-142430160289207/AnsiballZ_stat.py'
Jan 23 08:58:18 compute-0 sudo[57237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:18 compute-0 sudo[57237]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:18 compute-0 sudo[57360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuuoylrclfsdjogfiarqnzjmecenizvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158697.881557-773-142430160289207/AnsiballZ_copy.py'
Jan 23 08:58:18 compute-0 sudo[57360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:18 compute-0 sudo[57360]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:19 compute-0 sudo[57512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhpvkisbohgwvsaennpjdhkytqpdqebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158698.7671394-818-124000853068806/AnsiballZ_slurp.py'
Jan 23 08:58:19 compute-0 sudo[57512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:19 compute-0 python3.9[57514]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 08:58:19 compute-0 sudo[57512]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:19 compute-0 sudo[57687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rliyydniycrgclgcquhevkoxgxixdoyd ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158699.4054284-845-271164097623432/async_wrapper.py j241307169380 300 /home/zuul/.ansible/tmp/ansible-tmp-1769158699.4054284-845-271164097623432/AnsiballZ_edpm_os_net_config.py _'
Jan 23 08:58:19 compute-0 sudo[57687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:20 compute-0 ansible-async_wrapper.py[57689]: Invoked with j241307169380 300 /home/zuul/.ansible/tmp/ansible-tmp-1769158699.4054284-845-271164097623432/AnsiballZ_edpm_os_net_config.py _
Jan 23 08:58:20 compute-0 ansible-async_wrapper.py[57692]: Starting module and watcher
Jan 23 08:58:20 compute-0 ansible-async_wrapper.py[57692]: Start watching 57693 (300)
Jan 23 08:58:20 compute-0 ansible-async_wrapper.py[57693]: Start module (57693)
Jan 23 08:58:20 compute-0 ansible-async_wrapper.py[57689]: Return async_wrapper task started.
Jan 23 08:58:20 compute-0 sudo[57687]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:20 compute-0 python3.9[57694]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 08:58:20 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 08:58:20 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 08:58:20 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 08:58:20 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 08:58:20 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.4753] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.4767] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5154] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5155] audit: op="connection-add" uuid="c4303188-0250-4a99-ab5a-92a1c2275499" name="br-ex-br" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5166] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5167] audit: op="connection-add" uuid="e3228f04-3bcb-4853-af07-4aa9d4b457f0" name="br-ex-port" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5176] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5177] audit: op="connection-add" uuid="bde061f7-d76f-4aee-93c6-cc0d6a82235c" name="eth1-port" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5186] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5187] audit: op="connection-add" uuid="a6ea65a0-accc-45b3-ba17-6f133bff7f1f" name="vlan20-port" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5196] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5197] audit: op="connection-add" uuid="8e38cfc2-294c-4f82-922c-50abc47569a2" name="vlan21-port" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5205] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5207] audit: op="connection-add" uuid="d7d0e194-14a8-4b5a-954e-899d87561665" name="vlan22-port" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5222] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.may-fail,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5235] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5237] audit: op="connection-add" uuid="cddf7510-0364-47a3-926c-7c9a2d8b6802" name="br-ex-if" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5255] audit: op="connection-update" uuid="eb1a661b-2380-537d-bcf6-15c64e0c452b" name="ci-private-network" args="connection.controller,connection.slave-type,connection.port-type,connection.master,connection.timestamp,ipv4.dns,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.routing-rules,ipv4.routes,ipv6.dns,ipv6.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.routing-rules,ovs-external-ids.data,ovs-interface.type" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5268] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5269] audit: op="connection-add" uuid="83501210-459a-418b-92a9-fe1b5e18e77e" name="vlan20-if" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5282] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5283] audit: op="connection-add" uuid="5a6de098-116e-47d6-bc1a-4ce34b604941" name="vlan21-if" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5295] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5296] audit: op="connection-add" uuid="5b349f07-8292-4687-8995-462ed152eb46" name="vlan22-if" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5305] audit: op="connection-delete" uuid="17ffb41b-c510-3ee4-8367-2fd6b152fbfa" name="Wired connection 1" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5313] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5315] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5320] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5322] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (c4303188-0250-4a99-ab5a-92a1c2275499)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5323] audit: op="connection-activate" uuid="c4303188-0250-4a99-ab5a-92a1c2275499" name="br-ex-br" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5324] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5324] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5328] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5331] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (e3228f04-3bcb-4853-af07-4aa9d4b457f0)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5332] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5333] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5336] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5339] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (bde061f7-d76f-4aee-93c6-cc0d6a82235c)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5340] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5341] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5344] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5347] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (a6ea65a0-accc-45b3-ba17-6f133bff7f1f)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5349] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5349] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5352] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5355] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (8e38cfc2-294c-4f82-922c-50abc47569a2)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5357] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5357] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5362] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5365] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d7d0e194-14a8-4b5a-954e-899d87561665)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5366] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5368] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5369] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5373] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5374] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5377] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5380] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (cddf7510-0364-47a3-926c-7c9a2d8b6802)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5381] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5383] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5384] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5385] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5386] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5393] device (eth1): disconnecting for new activation request.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5395] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5397] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5399] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5400] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5402] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5404] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5407] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5410] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (83501210-459a-418b-92a9-fe1b5e18e77e)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5411] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5413] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5415] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5417] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5418] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5419] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5421] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5423] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (5a6de098-116e-47d6-bc1a-4ce34b604941)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5424] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5426] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5427] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5428] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5429] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <warn>  [1769158701.5430] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5432] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5434] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (5b349f07-8292-4687-8995-462ed152eb46)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5435] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5437] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5439] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5439] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5440] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5450] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.may-fail,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5451] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5454] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5455] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5460] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5462] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 23 08:58:21 compute-0 kernel: Timeout policy base is empty
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5465] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5468] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5469] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5478] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 systemd-udevd[57701]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5481] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5484] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5485] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5489] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5491] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5494] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5495] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5499] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5502] dhcp4 (eth0): canceled DHCP transaction
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5502] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5502] dhcp4 (eth0): state changed no lease
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5502] dhcp6 (eth0): canceled DHCP transaction
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5502] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5503] dhcp6 (eth0): state changed no lease
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5506] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5512] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5516] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57695 uid=0 result="fail" reason="Device is not activated"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5519] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5525] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 08:58:21 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5543] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5545] dhcp4 (eth0): state changed new lease, address=192.168.26.29
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5609] device (eth1): disconnecting for new activation request.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5609] audit: op="connection-activate" uuid="eb1a661b-2380-537d-bcf6-15c64e0c452b" name="ci-private-network" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5635] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57695 uid=0 result="success"
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5635] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5701] device (eth1): Activation: starting connection 'ci-private-network' (eb1a661b-2380-537d-bcf6-15c64e0c452b)
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5704] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5709] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5711] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5715] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5717] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5721] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5722] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5722] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5723] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5724] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5728] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5740] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5742] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5745] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5747] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5750] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5752] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5754] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5756] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5759] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5761] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5764] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5781] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5797] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5798] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5801] device (eth1): Activation: successful, device activated.
Jan 23 08:58:21 compute-0 kernel: br-ex: entered promiscuous mode
Jan 23 08:58:21 compute-0 kernel: vlan22: entered promiscuous mode
Jan 23 08:58:21 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5908] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 08:58:21 compute-0 systemd-udevd[57700]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5926] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 kernel: vlan20: entered promiscuous mode
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5965] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5973] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.5979] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6022] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6042] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6059] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6063] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6070] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 kernel: vlan21: entered promiscuous mode
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6107] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6120] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6167] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6173] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6183] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6189] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6197] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6214] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6216] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 08:58:21 compute-0 NetworkManager[54920]: <info>  [1769158701.6223] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 08:58:22 compute-0 NetworkManager[54920]: <info>  [1769158702.6874] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57695 uid=0 result="success"
Jan 23 08:58:22 compute-0 NetworkManager[54920]: <info>  [1769158702.7843] checkpoint[0x55dafa0ab950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 08:58:22 compute-0 NetworkManager[54920]: <info>  [1769158702.7845] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57695 uid=0 result="success"
Jan 23 08:58:22 compute-0 NetworkManager[54920]: <info>  [1769158702.8866] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57695 uid=0 result="success"
Jan 23 08:58:22 compute-0 NetworkManager[54920]: <info>  [1769158702.8877] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.0255] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.1319] checkpoint[0x55dafa0aba20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.1324] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.3369] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.3381] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.4854] audit: op="networking-control" arg="global-dns-configuration" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.4865] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf)
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.4870] audit: op="networking-control" arg="global-dns-configuration" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.4885] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 sudo[58027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwyfazncswlxwsmrmxeefdvacvixscog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158703.1876338-845-116608843705508/AnsiballZ_async_status.py'
Jan 23 08:58:23 compute-0 sudo[58027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.6042] checkpoint[0x55dafa0abaf0]: destroy /org/freedesktop/NetworkManager/Checkpoint/3
Jan 23 08:58:23 compute-0 NetworkManager[54920]: <info>  [1769158703.6045] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57695 uid=0 result="success"
Jan 23 08:58:23 compute-0 ansible-async_wrapper.py[57693]: Module complete (57693)
Jan 23 08:58:23 compute-0 python3.9[58029]: ansible-ansible.legacy.async_status Invoked with jid=j241307169380.57689 mode=status _async_dir=/root/.ansible_async
Jan 23 08:58:23 compute-0 sudo[58027]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:23 compute-0 sudo[58126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wplxakerqoblpoubvoownnegkluankoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158703.1876338-845-116608843705508/AnsiballZ_async_status.py'
Jan 23 08:58:23 compute-0 sudo[58126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:24 compute-0 python3.9[58128]: ansible-ansible.legacy.async_status Invoked with jid=j241307169380.57689 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 08:58:24 compute-0 sudo[58126]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:25 compute-0 ansible-async_wrapper.py[57692]: Done in kid B.
Jan 23 08:58:27 compute-0 sudo[58280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efbtmabsryrkwlhpepirowizlwyfyeyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158707.5771928-916-137808013941944/AnsiballZ_stat.py'
Jan 23 08:58:27 compute-0 sudo[58280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:27 compute-0 python3.9[58282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:58:27 compute-0 sudo[58280]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:28 compute-0 sudo[58403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-relmnugnwyhcikndrzqyklnobfpjcloo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158707.5771928-916-137808013941944/AnsiballZ_copy.py'
Jan 23 08:58:28 compute-0 sudo[58403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:28 compute-0 python3.9[58405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158707.5771928-916-137808013941944/.source.returncode _original_basename=.rpaz9iu_ follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:28 compute-0 sudo[58403]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:28 compute-0 sudo[58555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twfeyromcarndeuhuwocnhdmcmkjsvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158708.5314567-964-54501824484795/AnsiballZ_stat.py'
Jan 23 08:58:28 compute-0 sudo[58555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:28 compute-0 python3.9[58557]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:58:28 compute-0 sudo[58555]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:29 compute-0 sudo[58678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchhhosbqywvcseupzbazedmkfdfzvkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158708.5314567-964-54501824484795/AnsiballZ_copy.py'
Jan 23 08:58:29 compute-0 sudo[58678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:29 compute-0 python3.9[58680]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158708.5314567-964-54501824484795/.source.cfg _original_basename=.k70k3czf follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:29 compute-0 sudo[58678]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:29 compute-0 sudo[58830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlvupglzfcsshvrufhrpyrdstzhjwwaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158709.4027598-1009-35239588369137/AnsiballZ_systemd.py'
Jan 23 08:58:29 compute-0 sudo[58830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:29 compute-0 python3.9[58832]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:58:29 compute-0 systemd[1]: Reloading Network Manager...
Jan 23 08:58:29 compute-0 NetworkManager[54920]: <info>  [1769158709.8819] audit: op="reload" arg="0" pid=58836 uid=0 result="success"
Jan 23 08:58:29 compute-0 NetworkManager[54920]: <info>  [1769158709.8823] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 08:58:29 compute-0 NetworkManager[54920]: <info>  [1769158709.8824] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 08:58:29 compute-0 systemd[1]: Reloaded Network Manager.
Jan 23 08:58:29 compute-0 sudo[58830]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:30 compute-0 sshd-session[50920]: Connection closed by 192.168.122.30 port 59554
Jan 23 08:58:30 compute-0 sshd-session[50917]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:58:30 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 08:58:30 compute-0 systemd[1]: session-11.scope: Consumed 35.003s CPU time.
Jan 23 08:58:30 compute-0 systemd-logind[746]: Session 11 logged out. Waiting for processes to exit.
Jan 23 08:58:30 compute-0 systemd-logind[746]: Removed session 11.
Jan 23 08:58:30 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 08:58:35 compute-0 sshd-session[58869]: Accepted publickey for zuul from 192.168.122.30 port 48672 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:58:35 compute-0 systemd-logind[746]: New session 12 of user zuul.
Jan 23 08:58:35 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 23 08:58:35 compute-0 sshd-session[58869]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:58:36 compute-0 python3.9[59022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:58:37 compute-0 python3.9[59177]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:58:38 compute-0 python3.9[59366]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:58:38 compute-0 sshd-session[58872]: Connection closed by 192.168.122.30 port 48672
Jan 23 08:58:38 compute-0 sshd-session[58869]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:58:38 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 08:58:38 compute-0 systemd[1]: session-12.scope: Consumed 1.645s CPU time.
Jan 23 08:58:38 compute-0 systemd-logind[746]: Session 12 logged out. Waiting for processes to exit.
Jan 23 08:58:38 compute-0 systemd-logind[746]: Removed session 12.
Jan 23 08:58:39 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 08:58:43 compute-0 sshd-session[59395]: Accepted publickey for zuul from 192.168.122.30 port 58338 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:58:43 compute-0 systemd-logind[746]: New session 13 of user zuul.
Jan 23 08:58:43 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 23 08:58:43 compute-0 sshd-session[59395]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:58:44 compute-0 python3.9[59548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:58:45 compute-0 python3.9[59702]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:58:45 compute-0 sudo[59856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhpliftgcjsdtnqdjmkvzhvoateqwlxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158725.5688121-80-258507546182505/AnsiballZ_setup.py'
Jan 23 08:58:45 compute-0 sudo[59856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:45 compute-0 python3.9[59858]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:58:46 compute-0 sudo[59856]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:46 compute-0 sudo[59941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bztoocrieyauaremqhwmquonnbclhabe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158725.5688121-80-258507546182505/AnsiballZ_dnf.py'
Jan 23 08:58:46 compute-0 sudo[59941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:46 compute-0 python3.9[59943]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:58:47 compute-0 sudo[59941]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:47 compute-0 sudo[60094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssvpfygnspduhqxvmmpdxgbxlchsqjxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158727.7473161-116-161138765086546/AnsiballZ_setup.py'
Jan 23 08:58:47 compute-0 sudo[60094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:48 compute-0 python3.9[60096]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:58:48 compute-0 sudo[60094]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:48 compute-0 sudo[60285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufcboplpbxninbihqjaxyrslkgubcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158728.587709-149-33736575936505/AnsiballZ_file.py'
Jan 23 08:58:48 compute-0 sudo[60285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:49 compute-0 python3.9[60287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:49 compute-0 sudo[60285]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:49 compute-0 sudo[60437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxewghcghpidszwuemqanxpfeoyztskj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158729.1810257-173-90841959458921/AnsiballZ_command.py'
Jan 23 08:58:49 compute-0 sudo[60437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:49 compute-0 python3.9[60439]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:58:49 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 08:58:49 compute-0 sudo[60437]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:50 compute-0 sudo[60599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqdrcqvlwnvhccmodkvvgkwphzzqempc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158729.823314-197-239987515355381/AnsiballZ_stat.py'
Jan 23 08:58:50 compute-0 sudo[60599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:50 compute-0 python3.9[60601]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:58:50 compute-0 sudo[60599]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:50 compute-0 sudo[60677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyojjaekjlbsukprmwefflaupqgfzwxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158729.823314-197-239987515355381/AnsiballZ_file.py'
Jan 23 08:58:50 compute-0 sudo[60677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:50 compute-0 python3.9[60679]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:58:50 compute-0 sudo[60677]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:50 compute-0 sudo[60829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yglzuxgmavsqjklxriecasrhcsbhkikj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158730.7357364-233-135498955954101/AnsiballZ_stat.py'
Jan 23 08:58:50 compute-0 sudo[60829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:51 compute-0 python3.9[60831]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:58:51 compute-0 sudo[60829]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:51 compute-0 sudo[60907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwzyjikmxrtcbfrdwxqnubhahlrbhnre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158730.7357364-233-135498955954101/AnsiballZ_file.py'
Jan 23 08:58:51 compute-0 sudo[60907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:51 compute-0 python3.9[60909]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:58:51 compute-0 sudo[60907]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:51 compute-0 sudo[61059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvhnahsjkanpiajjwcdioprawswfxvxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158731.5976949-272-97883697295097/AnsiballZ_ini_file.py'
Jan 23 08:58:51 compute-0 sudo[61059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:52 compute-0 python3.9[61061]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:58:52 compute-0 sudo[61059]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:52 compute-0 sudo[61211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqfvsowrfvugjaayucgluisljqzdcirt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158732.1641386-272-58988265060810/AnsiballZ_ini_file.py'
Jan 23 08:58:52 compute-0 sudo[61211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:52 compute-0 python3.9[61213]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:58:52 compute-0 sudo[61211]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:52 compute-0 sudo[61364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhjahubaexchfxwaaqttzxvpuvadlfff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158732.6218235-272-10163769189267/AnsiballZ_ini_file.py'
Jan 23 08:58:52 compute-0 sudo[61364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:52 compute-0 python3.9[61366]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:58:52 compute-0 sudo[61364]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:53 compute-0 sudo[61516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwvakfhgwyxleqnbwcheydztmvkdaxlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158733.130286-272-75359520819469/AnsiballZ_ini_file.py'
Jan 23 08:58:53 compute-0 sudo[61516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:53 compute-0 python3.9[61518]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:58:53 compute-0 sudo[61516]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:53 compute-0 sudo[61668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pybhdqxjlkjflnwyvcgmmllivkljlodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158733.7454476-365-31456011670269/AnsiballZ_dnf.py'
Jan 23 08:58:53 compute-0 sudo[61668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:54 compute-0 python3.9[61670]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:58:55 compute-0 sudo[61668]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:55 compute-0 sudo[61821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljocepbyaakhfmduduorgiugliafspkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158735.4641597-398-197803962624352/AnsiballZ_setup.py'
Jan 23 08:58:55 compute-0 sudo[61821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:55 compute-0 python3.9[61823]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:58:55 compute-0 sudo[61821]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:56 compute-0 sudo[61975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keqiqoopwxtrimfkhejzqmogvbqwpvse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158736.0963075-422-57865715414913/AnsiballZ_stat.py'
Jan 23 08:58:56 compute-0 sudo[61975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:56 compute-0 python3.9[61977]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:58:56 compute-0 sudo[61975]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:56 compute-0 sudo[62127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shghcxsuojsxiffunayoeuwmgxxkylbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158736.6571636-449-163565822910777/AnsiballZ_stat.py'
Jan 23 08:58:56 compute-0 sudo[62127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:56 compute-0 python3.9[62129]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 08:58:57 compute-0 sudo[62127]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:57 compute-0 sudo[62279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqvjltuckkkhznhsjajnfuxzsjvahgwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158737.3078518-479-85996996151693/AnsiballZ_command.py'
Jan 23 08:58:57 compute-0 sudo[62279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:57 compute-0 python3.9[62281]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:58:57 compute-0 sudo[62279]: pam_unix(sudo:session): session closed for user root
Jan 23 08:58:58 compute-0 sudo[62432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htrllegwtvevviiivdficurmpwvfokah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158737.9280818-509-212317326896415/AnsiballZ_service_facts.py'
Jan 23 08:58:58 compute-0 sudo[62432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:58:58 compute-0 python3.9[62434]: ansible-service_facts Invoked
Jan 23 08:58:58 compute-0 network[62451]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 08:58:58 compute-0 network[62452]: 'network-scripts' will be removed from distribution in near future.
Jan 23 08:58:58 compute-0 network[62453]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 08:59:00 compute-0 sudo[62432]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:02 compute-0 sudo[62736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljmyhdvcbuzwmjpsspfckczucadnkcgv ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769158741.8160622-554-193615853115581/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769158741.8160622-554-193615853115581/args'
Jan 23 08:59:02 compute-0 sudo[62736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:02 compute-0 sudo[62736]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:02 compute-0 sudo[62903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wshlgpfdnhihzkrekvwvszxjcpgginwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158742.3223085-587-156718284876137/AnsiballZ_dnf.py'
Jan 23 08:59:02 compute-0 sudo[62903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:02 compute-0 python3.9[62905]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 08:59:03 compute-0 sudo[62903]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:04 compute-0 sudo[63056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyawyqbzamrkreqrxjvbkdnystmduqel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158744.1535037-626-146579166206248/AnsiballZ_package_facts.py'
Jan 23 08:59:04 compute-0 sudo[63056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:04 compute-0 python3.9[63058]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 08:59:05 compute-0 sudo[63056]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:05 compute-0 sudo[63208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmjrmbballswegwaixdhyvozqjewijup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158745.626451-656-204092506465840/AnsiballZ_stat.py'
Jan 23 08:59:05 compute-0 sudo[63208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:05 compute-0 python3.9[63210]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:06 compute-0 sudo[63208]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:06 compute-0 sudo[63333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazscdupgibxtomumywmrckagjdjvfzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158745.626451-656-204092506465840/AnsiballZ_copy.py'
Jan 23 08:59:06 compute-0 sudo[63333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:06 compute-0 python3.9[63335]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158745.626451-656-204092506465840/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:06 compute-0 sudo[63333]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:06 compute-0 sudo[63487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chcxajmttpwwlgsilgkexpxdwrcmpgbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158746.6636505-701-223470914173729/AnsiballZ_stat.py'
Jan 23 08:59:06 compute-0 sudo[63487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:07 compute-0 python3.9[63489]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:07 compute-0 sudo[63487]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:07 compute-0 sudo[63612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfjdbgdpliezllipnwoxlmbnkigdkcxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158746.6636505-701-223470914173729/AnsiballZ_copy.py'
Jan 23 08:59:07 compute-0 sudo[63612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:07 compute-0 python3.9[63614]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158746.6636505-701-223470914173729/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:07 compute-0 sudo[63612]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:08 compute-0 sudo[63766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kurmswiduubujxorxbnadvobxdzdmukk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158748.270405-764-206863807127572/AnsiballZ_lineinfile.py'
Jan 23 08:59:08 compute-0 sudo[63766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:08 compute-0 python3.9[63768]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:08 compute-0 sudo[63766]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:09 compute-0 sudo[63920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqsakwnvqralceekrkpbvmoyqwkoolyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158749.633341-809-212858490240172/AnsiballZ_setup.py'
Jan 23 08:59:09 compute-0 sudo[63920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:10 compute-0 python3.9[63922]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:59:10 compute-0 sudo[63920]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:10 compute-0 sudo[64004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksrgvpoxpurnplcadsnomrndluhrxgjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158749.633341-809-212858490240172/AnsiballZ_systemd.py'
Jan 23 08:59:10 compute-0 sudo[64004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:10 compute-0 python3.9[64006]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:10 compute-0 sudo[64004]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:11 compute-0 sudo[64158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvdaeykeqqderygerwqvxmwzzfmfgmje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158751.704594-857-225126607212196/AnsiballZ_setup.py'
Jan 23 08:59:11 compute-0 sudo[64158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:12 compute-0 python3.9[64160]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 08:59:12 compute-0 sudo[64158]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:12 compute-0 sudo[64242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhyfdxncflzhimndmzknztoptlwsyhrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158751.704594-857-225126607212196/AnsiballZ_systemd.py'
Jan 23 08:59:12 compute-0 sudo[64242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:12 compute-0 python3.9[64244]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:59:12 compute-0 chronyd[754]: chronyd exiting
Jan 23 08:59:12 compute-0 systemd[1]: Stopping NTP client/server...
Jan 23 08:59:12 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 08:59:12 compute-0 systemd[1]: Stopped NTP client/server.
Jan 23 08:59:12 compute-0 systemd[1]: Starting NTP client/server...
Jan 23 08:59:12 compute-0 chronyd[64253]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 08:59:12 compute-0 chronyd[64253]: Frequency -9.662 +/- 0.483 ppm read from /var/lib/chrony/drift
Jan 23 08:59:12 compute-0 chronyd[64253]: Loaded seccomp filter (level 2)
Jan 23 08:59:12 compute-0 systemd[1]: Started NTP client/server.
Jan 23 08:59:12 compute-0 sudo[64242]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:13 compute-0 sshd-session[59398]: Connection closed by 192.168.122.30 port 58338
Jan 23 08:59:13 compute-0 sshd-session[59395]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:59:13 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 08:59:13 compute-0 systemd[1]: session-13.scope: Consumed 17.803s CPU time.
Jan 23 08:59:13 compute-0 systemd-logind[746]: Session 13 logged out. Waiting for processes to exit.
Jan 23 08:59:13 compute-0 systemd-logind[746]: Removed session 13.
Jan 23 08:59:18 compute-0 sshd-session[64279]: Accepted publickey for zuul from 192.168.122.30 port 47826 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:59:18 compute-0 systemd-logind[746]: New session 14 of user zuul.
Jan 23 08:59:18 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 23 08:59:18 compute-0 sshd-session[64279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:59:18 compute-0 python3.9[64432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 08:59:19 compute-0 sudo[64586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcydjhubzcivwbeqdavjmqkrqfszlbjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158759.3178337-59-38573703949780/AnsiballZ_file.py'
Jan 23 08:59:19 compute-0 sudo[64586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:19 compute-0 python3.9[64588]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:19 compute-0 sudo[64586]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:20 compute-0 sudo[64761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzcowqszywfusuunmmgkqhkgabgqvpkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158760.0572824-83-79833484546036/AnsiballZ_stat.py'
Jan 23 08:59:20 compute-0 sudo[64761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:20 compute-0 python3.9[64763]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:20 compute-0 sudo[64761]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:20 compute-0 sudo[64839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsctkhkryqnizecknseqvmscabwwgwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158760.0572824-83-79833484546036/AnsiballZ_file.py'
Jan 23 08:59:20 compute-0 sudo[64839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:20 compute-0 python3.9[64841]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.8f2ype_w recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:20 compute-0 sudo[64839]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:21 compute-0 sudo[64991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcjkemaogljnmkjccbhihzhlxppksnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158761.3881078-143-112134370983859/AnsiballZ_stat.py'
Jan 23 08:59:21 compute-0 sudo[64991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:21 compute-0 python3.9[64993]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:21 compute-0 sudo[64991]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:22 compute-0 sudo[65114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxbtkivbjemyltfynkqhkbulganynbpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158761.3881078-143-112134370983859/AnsiballZ_copy.py'
Jan 23 08:59:22 compute-0 sudo[65114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:22 compute-0 python3.9[65116]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158761.3881078-143-112134370983859/.source _original_basename=.e3evajo3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:22 compute-0 sudo[65114]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:22 compute-0 sudo[65266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijajiyvfbjsmgrmucufksmzvzkmzfez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158762.405672-191-143743470299488/AnsiballZ_file.py'
Jan 23 08:59:22 compute-0 sudo[65266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:22 compute-0 python3.9[65268]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:59:22 compute-0 sudo[65266]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:23 compute-0 sudo[65418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgrlvsfrdfqzluauowlogbfcahxmiwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158762.8802-215-204999451156136/AnsiballZ_stat.py'
Jan 23 08:59:23 compute-0 sudo[65418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:23 compute-0 python3.9[65420]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:23 compute-0 sudo[65418]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:23 compute-0 sudo[65541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzsggbwunbuwsqdydidmsdwhbboarixj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158762.8802-215-204999451156136/AnsiballZ_copy.py'
Jan 23 08:59:23 compute-0 sudo[65541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:23 compute-0 python3.9[65543]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158762.8802-215-204999451156136/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:59:23 compute-0 sudo[65541]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:23 compute-0 sudo[65693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkebiguxgocamhcvlitlryalzrhuhztl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158763.6882377-215-251547676599796/AnsiballZ_stat.py'
Jan 23 08:59:23 compute-0 sudo[65693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:23 compute-0 python3.9[65695]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:24 compute-0 sudo[65693]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:24 compute-0 sudo[65816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqoqstgommurjljvtklcjpehkhkphaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158763.6882377-215-251547676599796/AnsiballZ_copy.py'
Jan 23 08:59:24 compute-0 sudo[65816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:24 compute-0 python3.9[65818]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158763.6882377-215-251547676599796/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 08:59:24 compute-0 sudo[65816]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:24 compute-0 sudo[65968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdmfemckjruvumpgahqcaomtfzqmpxaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158764.5820956-302-174132020865463/AnsiballZ_file.py'
Jan 23 08:59:24 compute-0 sudo[65968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:24 compute-0 python3.9[65970]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:24 compute-0 sudo[65968]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:25 compute-0 sudo[66120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esbbdxdqkoflwkkpkiryaisdqujztatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158765.0774817-326-197210902317550/AnsiballZ_stat.py'
Jan 23 08:59:25 compute-0 sudo[66120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:25 compute-0 python3.9[66122]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:25 compute-0 sudo[66120]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:25 compute-0 sudo[66243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejvlrawhnzpupgyxbvdhtefhukafpeqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158765.0774817-326-197210902317550/AnsiballZ_copy.py'
Jan 23 08:59:25 compute-0 sudo[66243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:25 compute-0 python3.9[66245]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158765.0774817-326-197210902317550/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:25 compute-0 sudo[66243]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:26 compute-0 sudo[66395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqbaqpdlxyrehcuyuxegsveqwbnmmrqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158765.9488645-371-116398284981408/AnsiballZ_stat.py'
Jan 23 08:59:26 compute-0 sudo[66395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:26 compute-0 python3.9[66397]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:26 compute-0 sudo[66395]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:26 compute-0 sudo[66518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bulbopgbtsmrcvpwtsselicdoodbrqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158765.9488645-371-116398284981408/AnsiballZ_copy.py'
Jan 23 08:59:26 compute-0 sudo[66518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:26 compute-0 python3.9[66520]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158765.9488645-371-116398284981408/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:26 compute-0 sudo[66518]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:27 compute-0 sudo[66670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egaqkaalpnheknohadfwjhwpllbcvebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158766.851284-416-11675452907742/AnsiballZ_systemd.py'
Jan 23 08:59:27 compute-0 sudo[66670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:27 compute-0 python3.9[66672]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:27 compute-0 systemd[1]: Reloading.
Jan 23 08:59:27 compute-0 systemd-rc-local-generator[66697]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:27 compute-0 systemd-sysv-generator[66700]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:27 compute-0 systemd[1]: Reloading.
Jan 23 08:59:27 compute-0 systemd-rc-local-generator[66731]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:27 compute-0 systemd-sysv-generator[66734]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:27 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 08:59:27 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 08:59:27 compute-0 sudo[66670]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:28 compute-0 sudo[66897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftyfpzwdgecateblkovdsjcglkrrwerc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158768.1426923-440-48169169472447/AnsiballZ_stat.py'
Jan 23 08:59:28 compute-0 sudo[66897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:28 compute-0 python3.9[66899]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:28 compute-0 sudo[66897]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:28 compute-0 sudo[67020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgpwtknszafoygdncxgxypzcvjcaibsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158768.1426923-440-48169169472447/AnsiballZ_copy.py'
Jan 23 08:59:28 compute-0 sudo[67020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:28 compute-0 python3.9[67022]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158768.1426923-440-48169169472447/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:28 compute-0 sudo[67020]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:29 compute-0 sudo[67172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glywljwyjjqjbllhueuxmjrdvnecxxhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158769.0902755-485-8113149238340/AnsiballZ_stat.py'
Jan 23 08:59:29 compute-0 sudo[67172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:29 compute-0 python3.9[67174]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:29 compute-0 sudo[67172]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:29 compute-0 sudo[67295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjwbhxcvrtytnumvwyfcwhkmjqcwxkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158769.0902755-485-8113149238340/AnsiballZ_copy.py'
Jan 23 08:59:29 compute-0 sudo[67295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:29 compute-0 python3.9[67297]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158769.0902755-485-8113149238340/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:29 compute-0 sudo[67295]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:30 compute-0 sudo[67447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiumhnqfogznokzkollrcybptntiuoqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158769.9753702-530-180373414651558/AnsiballZ_systemd.py'
Jan 23 08:59:30 compute-0 sudo[67447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:30 compute-0 python3.9[67449]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:30 compute-0 systemd[1]: Reloading.
Jan 23 08:59:30 compute-0 systemd-sysv-generator[67475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:30 compute-0 systemd-rc-local-generator[67472]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:30 compute-0 systemd[1]: Reloading.
Jan 23 08:59:30 compute-0 systemd-rc-local-generator[67507]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:30 compute-0 systemd-sysv-generator[67510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:30 compute-0 systemd[1]: Starting Create netns directory...
Jan 23 08:59:30 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 08:59:30 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 08:59:30 compute-0 systemd[1]: Finished Create netns directory.
Jan 23 08:59:30 compute-0 sudo[67447]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:31 compute-0 python3.9[67675]: ansible-ansible.builtin.service_facts Invoked
Jan 23 08:59:31 compute-0 network[67692]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 08:59:31 compute-0 network[67693]: 'network-scripts' will be removed from distribution in near future.
Jan 23 08:59:31 compute-0 network[67694]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 08:59:33 compute-0 sudo[67954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymmbblcwqmpwagyuxcdfyvwqqmuypapq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158773.2653668-578-49063172951590/AnsiballZ_systemd.py'
Jan 23 08:59:33 compute-0 sudo[67954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:33 compute-0 python3.9[67956]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:33 compute-0 systemd[1]: Reloading.
Jan 23 08:59:33 compute-0 systemd-rc-local-generator[67981]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:33 compute-0 systemd-sysv-generator[67986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:33 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 08:59:34 compute-0 iptables.init[67996]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 08:59:34 compute-0 iptables.init[67996]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 08:59:34 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 08:59:34 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 08:59:34 compute-0 sudo[67954]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:34 compute-0 sudo[68190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvylvblnadbanudzikonveypamcwyhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158774.2468944-578-219813206576805/AnsiballZ_systemd.py'
Jan 23 08:59:34 compute-0 sudo[68190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:34 compute-0 python3.9[68192]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:34 compute-0 sudo[68190]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:35 compute-0 sudo[68344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiyulxmokvemnfhktnsnnulojfwhnimg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158774.9471488-626-259926042063068/AnsiballZ_systemd.py'
Jan 23 08:59:35 compute-0 sudo[68344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:35 compute-0 python3.9[68346]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 08:59:35 compute-0 systemd[1]: Reloading.
Jan 23 08:59:35 compute-0 systemd-rc-local-generator[68369]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 08:59:35 compute-0 systemd-sysv-generator[68372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 08:59:35 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 23 08:59:35 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 23 08:59:35 compute-0 sudo[68344]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:36 compute-0 sudo[68536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvknnxmbvfycoitmkbekaljdlbbtzoal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158775.8094683-650-113038112860545/AnsiballZ_command.py'
Jan 23 08:59:36 compute-0 sudo[68536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:36 compute-0 python3.9[68538]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:59:36 compute-0 sudo[68536]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:36 compute-0 sudo[68689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zznylmngtpqtkntwwatjjsqozylxnyog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158776.770865-692-205881509402975/AnsiballZ_stat.py'
Jan 23 08:59:36 compute-0 sudo[68689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:37 compute-0 python3.9[68691]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:37 compute-0 sudo[68689]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:37 compute-0 sudo[68814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrpfjvrgivwjnwxyeejpjxlpbcftcujw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158776.770865-692-205881509402975/AnsiballZ_copy.py'
Jan 23 08:59:37 compute-0 sudo[68814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:37 compute-0 python3.9[68816]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158776.770865-692-205881509402975/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:37 compute-0 sudo[68814]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:37 compute-0 sudo[68967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyamvzyynymuqlwpplrwlfpcnikjmjiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158777.7016814-737-134550579619561/AnsiballZ_systemd.py'
Jan 23 08:59:37 compute-0 sudo[68967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:38 compute-0 python3.9[68969]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 08:59:38 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 08:59:38 compute-0 sshd[963]: Received SIGHUP; restarting.
Jan 23 08:59:38 compute-0 sshd[963]: Server listening on 0.0.0.0 port 22.
Jan 23 08:59:38 compute-0 sshd[963]: Server listening on :: port 22.
Jan 23 08:59:38 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 08:59:38 compute-0 sudo[68967]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:38 compute-0 sudo[69123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbdkdeqdhewjbdyrxclfvtvrvqlktfpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158778.3345962-761-79551768191869/AnsiballZ_file.py'
Jan 23 08:59:38 compute-0 sudo[69123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:38 compute-0 python3.9[69125]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:38 compute-0 sudo[69123]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:38 compute-0 sudo[69275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbptiusolrxyrswqqrcckowbjfeswuww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158778.8156826-785-64622659130282/AnsiballZ_stat.py'
Jan 23 08:59:38 compute-0 sudo[69275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:39 compute-0 python3.9[69277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:39 compute-0 sudo[69275]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:39 compute-0 sudo[69398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxgpyrhuevuzctrctgbdfqivammyqorq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158778.8156826-785-64622659130282/AnsiballZ_copy.py'
Jan 23 08:59:39 compute-0 sudo[69398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:39 compute-0 python3.9[69400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158778.8156826-785-64622659130282/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:39 compute-0 sudo[69398]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:40 compute-0 sudo[69550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zarcsmjtgujkcobpujjfmnkucvsfpisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158779.9872415-839-129426544308444/AnsiballZ_timezone.py'
Jan 23 08:59:40 compute-0 sudo[69550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:40 compute-0 python3.9[69552]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 08:59:40 compute-0 systemd[1]: Starting Time & Date Service...
Jan 23 08:59:40 compute-0 systemd[1]: Started Time & Date Service.
Jan 23 08:59:40 compute-0 sudo[69550]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:40 compute-0 sudo[69706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxttceidajmhnydtrzxgwpmlwtqbhygw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158780.792098-866-22355354229661/AnsiballZ_file.py'
Jan 23 08:59:40 compute-0 sudo[69706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:41 compute-0 python3.9[69708]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:41 compute-0 sudo[69706]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:41 compute-0 sudo[69858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoohuryrxmxtuxopsgnipzhnpinkfnmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158781.277353-890-83570903013824/AnsiballZ_stat.py'
Jan 23 08:59:41 compute-0 sudo[69858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:41 compute-0 python3.9[69860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:41 compute-0 sudo[69858]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:41 compute-0 sudo[69981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwwxlqmgyrvodaxhaufheisjabwfkppg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158781.277353-890-83570903013824/AnsiballZ_copy.py'
Jan 23 08:59:41 compute-0 sudo[69981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:42 compute-0 python3.9[69983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158781.277353-890-83570903013824/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:42 compute-0 sudo[69981]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:42 compute-0 sudo[70133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqnyshgxvqwoamizbmpxzdqjjxajiihe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158782.1783626-935-281139554970620/AnsiballZ_stat.py'
Jan 23 08:59:42 compute-0 sudo[70133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:42 compute-0 python3.9[70135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:42 compute-0 sudo[70133]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:42 compute-0 sudo[70256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvhxpnflimzmximpsvfvvipgldqnkyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158782.1783626-935-281139554970620/AnsiballZ_copy.py'
Jan 23 08:59:42 compute-0 sudo[70256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:42 compute-0 python3.9[70258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158782.1783626-935-281139554970620/.source.yaml _original_basename=.xzv_cug8 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:42 compute-0 sudo[70256]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:43 compute-0 sudo[70408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izxdessjqgejvphsnmnnlwrtsvmambpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158783.0742526-980-227874452181530/AnsiballZ_stat.py'
Jan 23 08:59:43 compute-0 sudo[70408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:43 compute-0 python3.9[70410]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:43 compute-0 sudo[70408]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:43 compute-0 sudo[70531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qctieppbzkihmcwsfhaxisbmwnswmoeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158783.0742526-980-227874452181530/AnsiballZ_copy.py'
Jan 23 08:59:43 compute-0 sudo[70531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:43 compute-0 python3.9[70533]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158783.0742526-980-227874452181530/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:43 compute-0 sudo[70531]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:44 compute-0 sudo[70683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsiegwhjmmhbdmlfvyarvufqsmofcnkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158783.9777193-1025-202423968147501/AnsiballZ_command.py'
Jan 23 08:59:44 compute-0 sudo[70683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:44 compute-0 python3.9[70685]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:59:44 compute-0 sudo[70683]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:44 compute-0 sudo[70836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdcqzswrxrghxrhjxuxakuqdbduuhhpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158784.4856784-1049-13027317092208/AnsiballZ_command.py'
Jan 23 08:59:44 compute-0 sudo[70836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:44 compute-0 python3.9[70838]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:59:44 compute-0 sudo[70836]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:45 compute-0 sudo[70989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cauqnlxtohffauhmhdsybbtfzwailnuy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769158785.0236456-1073-22741601538365/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 08:59:45 compute-0 sudo[70989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:45 compute-0 python3[70991]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 08:59:45 compute-0 sudo[70989]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:45 compute-0 sudo[71141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxfierwjmznrzqjvskyluhdbskvgxsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158785.7268972-1097-39070121467199/AnsiballZ_stat.py'
Jan 23 08:59:45 compute-0 sudo[71141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:46 compute-0 python3.9[71143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:46 compute-0 sudo[71141]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:46 compute-0 sudo[71264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiyvffidilonrzydahqkzojwnzuvkihn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158785.7268972-1097-39070121467199/AnsiballZ_copy.py'
Jan 23 08:59:46 compute-0 sudo[71264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:46 compute-0 python3.9[71266]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158785.7268972-1097-39070121467199/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:46 compute-0 sudo[71264]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:46 compute-0 sudo[71416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofawdkqspfhrvuqeywunvkxpjbilqluu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158786.7107985-1142-149233743696810/AnsiballZ_stat.py'
Jan 23 08:59:46 compute-0 sudo[71416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:47 compute-0 python3.9[71418]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:47 compute-0 sudo[71416]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:47 compute-0 sudo[71539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxavmmlqxapeleujnzglpholvhqconep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158786.7107985-1142-149233743696810/AnsiballZ_copy.py'
Jan 23 08:59:47 compute-0 sudo[71539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:47 compute-0 python3.9[71541]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158786.7107985-1142-149233743696810/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:47 compute-0 sudo[71539]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:47 compute-0 sudo[71691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxkicalspebaruojzudogfqnabwrpmun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158787.6603243-1187-26560912172858/AnsiballZ_stat.py'
Jan 23 08:59:47 compute-0 sudo[71691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:48 compute-0 python3.9[71693]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:48 compute-0 sudo[71691]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:48 compute-0 sudo[71814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xutrllrhsiusbwrbnvouxhtseaxuprzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158787.6603243-1187-26560912172858/AnsiballZ_copy.py'
Jan 23 08:59:48 compute-0 sudo[71814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:48 compute-0 python3.9[71816]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158787.6603243-1187-26560912172858/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:48 compute-0 sudo[71814]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:48 compute-0 sudo[71966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngprhlhfyywwyxmhttomczlpuoqgzdnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158788.5870352-1232-196552578615483/AnsiballZ_stat.py'
Jan 23 08:59:48 compute-0 sudo[71966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:48 compute-0 python3.9[71968]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:48 compute-0 sudo[71966]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:49 compute-0 sudo[72089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esdafsgnktcsywqopxplmorfvvyifxeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158788.5870352-1232-196552578615483/AnsiballZ_copy.py'
Jan 23 08:59:49 compute-0 sudo[72089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:49 compute-0 python3.9[72091]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158788.5870352-1232-196552578615483/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:49 compute-0 sudo[72089]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:49 compute-0 sudo[72241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydkjieylqgdqihetzruzjtdqawcoxzyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158789.4902637-1277-141571033914646/AnsiballZ_stat.py'
Jan 23 08:59:49 compute-0 sudo[72241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:49 compute-0 python3.9[72243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 08:59:49 compute-0 sudo[72241]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:50 compute-0 sudo[72364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oastcgjtsixkjsrjenozhefowwrptneh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158789.4902637-1277-141571033914646/AnsiballZ_copy.py'
Jan 23 08:59:50 compute-0 sudo[72364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:50 compute-0 python3.9[72366]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158789.4902637-1277-141571033914646/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:50 compute-0 sudo[72364]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:50 compute-0 sudo[72516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vibkcgzthglkwfdpxjhchzsidthvdinq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158790.4243436-1322-97210934240566/AnsiballZ_file.py'
Jan 23 08:59:50 compute-0 sudo[72516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:50 compute-0 python3.9[72518]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:50 compute-0 sudo[72516]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:51 compute-0 sudo[72668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnapifulvssrtgusftjnsqvyqcgvdxid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158790.9037743-1346-8554202922801/AnsiballZ_command.py'
Jan 23 08:59:51 compute-0 sudo[72668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:51 compute-0 python3.9[72670]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 08:59:51 compute-0 sudo[72668]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:51 compute-0 sudo[72827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdmbdbuhtjrrxhepldbdacfhztxkinvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158791.421215-1370-246176481423091/AnsiballZ_blockinfile.py'
Jan 23 08:59:51 compute-0 sudo[72827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:51 compute-0 python3.9[72829]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:51 compute-0 sudo[72827]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:52 compute-0 sudo[72980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmbuutgxunlzxaodhofjujjlqtasrota ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158792.149271-1397-157201557556240/AnsiballZ_file.py'
Jan 23 08:59:52 compute-0 sudo[72980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:52 compute-0 python3.9[72982]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:52 compute-0 sudo[72980]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:52 compute-0 sudo[73132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byqbgqdrhnsleohyiebyjvgvtubgexav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158792.582348-1397-88450320157080/AnsiballZ_file.py'
Jan 23 08:59:52 compute-0 sudo[73132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:52 compute-0 python3.9[73134]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 08:59:52 compute-0 sudo[73132]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:53 compute-0 sudo[73284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llszgxaiowvrwrhwziicrfoxlhfatmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158793.0958571-1442-55399601573536/AnsiballZ_mount.py'
Jan 23 08:59:53 compute-0 sudo[73284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:53 compute-0 python3.9[73286]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 08:59:53 compute-0 sudo[73284]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:53 compute-0 sudo[73437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyxnimjxgyxdoxdytnhsugdoludetaaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158793.7169526-1442-104038055836725/AnsiballZ_mount.py'
Jan 23 08:59:53 compute-0 sudo[73437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 08:59:54 compute-0 python3.9[73439]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 08:59:54 compute-0 sudo[73437]: pam_unix(sudo:session): session closed for user root
Jan 23 08:59:54 compute-0 sshd-session[64282]: Connection closed by 192.168.122.30 port 47826
Jan 23 08:59:54 compute-0 sshd-session[64279]: pam_unix(sshd:session): session closed for user zuul
Jan 23 08:59:54 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 08:59:54 compute-0 systemd[1]: session-14.scope: Consumed 24.202s CPU time.
Jan 23 08:59:54 compute-0 systemd-logind[746]: Session 14 logged out. Waiting for processes to exit.
Jan 23 08:59:54 compute-0 systemd-logind[746]: Removed session 14.
Jan 23 08:59:59 compute-0 sshd-session[73465]: Accepted publickey for zuul from 192.168.122.30 port 41632 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 08:59:59 compute-0 systemd-logind[746]: New session 15 of user zuul.
Jan 23 08:59:59 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 23 08:59:59 compute-0 sshd-session[73465]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 08:59:59 compute-0 sudo[73618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyclrnyzevsfcwnexbeaicahpuwolik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158799.4971628-23-13564559706937/AnsiballZ_tempfile.py'
Jan 23 08:59:59 compute-0 sudo[73618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:00 compute-0 python3.9[73620]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 09:00:00 compute-0 sudo[73618]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:00 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:00:00 compute-0 sudo[73771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sckjtbayzktagbbakuzjgtrovdaehqul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158800.168294-59-200827284936456/AnsiballZ_stat.py'
Jan 23 09:00:00 compute-0 sudo[73771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:00 compute-0 python3.9[73773]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:00:00 compute-0 sudo[73771]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:01 compute-0 sudo[73923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbiectjmoynmrpgyjmqdxrrmtxsuolqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158800.8648171-89-46610534211082/AnsiballZ_setup.py'
Jan 23 09:00:01 compute-0 sudo[73923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:01 compute-0 python3.9[73925]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:00:01 compute-0 sudo[73923]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:02 compute-0 sudo[74075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkcottrggsfvajbxqibkrreqocnatudb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158801.7181978-114-181402661777944/AnsiballZ_blockinfile.py'
Jan 23 09:00:02 compute-0 sudo[74075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:02 compute-0 python3.9[74077]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDjUu/Mj6GGmf4jKnlRE0TyzK1IX82J97pgcSSQiIRnKaLm5lUot6zubNofR9ORpCCclAYavqhSbmEAU6HdAmKSSQrijMmIJ9WHTRoZfobCJM3fboZX/TT6emwvwPadq0iKBp7nvubyzgTO2Hg+CUhXn+j5xUnveVgMTdYhOGeSXPzIaUO9zpFHFvFpA3qA25+kEL8I27vbJh0w47dl24HGXye6STW90mC60UUNB5aCcFJ4YmnBlcBAWsbm+ssQ0qiJP7Pu+6W/sUg6vXVJFcWIMu83geQkwpr1M2Q60khscb0kkUf0xnpYkMBVGlJM2LrF7dT0fxtV2huVLvflHSQDjMmdMLQXjuluCVCcB5QkSbKKymW6HjzyTCtaUEUXlLYpDh94+7OAyl7rLiwrEgMqBll7wF8AzslzRA2EhvWpTkF3+S8kKyhnPwe9rC5cWavDZCqgnZjK2MP7r6BFXZRXyQKWQEMJJp2QYgKtJCH8864CPk2L/qCQsdjjZ1urTsM=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGUtuchI3v1iXRftQa4VzOiJmo+d4nFgEpR4DWD1c2z9
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCLYiRE2TFAwmr+njFHaxAtN2JMkZ3lw9HCrXL5Z9B7KeFw34ZV/d4z77jBP/ILtkcDG74JxP7RkbQSa5zNhPiY=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDX+XPbw9T2NjcldxVVi0rNqrrnHIx0+fIibX7tYA0cCwPsR0gbrgKXpfEWv4yAHfVS907+MfEKO5ImPR18QohNauVi927QN0bMdUkwqEINxkJA1eGi19gcYypg4wIftiRX2WjxKvOwMe27gy5+aKA5S9BAMhwmqs3mbDutPlvDVPnvnzoiSMM4Ol1Y7ucsHTINXHdjYxdJ2kOVHEWpTM+upFs4yOeWIavl0JiJDKYjbvmD5HZtKVcVD1ZRyMYfjDCmdqUM+SykqoB4kFIVg7nSzGYZM5g5gLDRVvf3ffMDTj0Kg6Z+EDjv3aH1vwoVMZBkdOKNhvPcCqYiaYKJ+jRMuuS4Guc9uMAtQDnHQU+pNtCUvyJbM+IMpvAGEeUF/kD3OLnwMLaw7hdclsJvS72UViUO0+yxozQ8MCprJ0qNoEgw3qtVdQ4dj/KPS/BBvqA1Wlq0S2UZgz0O8pItHh5bBZQXdE8UDpXBdL63gdTOCfCGLYFxdHpI2iAZYuVorqM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGE/iq/mTcfXbhkN5aUTUNDALNwsnVHVdqrwEtqJCcnQ
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMsJS5NDxrzko3eezTkBkraoDc2LsCykusdV94nwfGHbuHypeaQLUQ8gRzieFCZk2SeDeFcY/RqMQMtJfjQjyfs=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC7Sg0OaQMNAWoNO20p8rWU4SQa6cm+a3WDDy5qlV6nvVCd03gYrTJmRU9wVM6Z8oqP5hkh3PkjxgBQt0m6+2ju89tFGKdUItJxpKYYNsr90lg1MUIu5y40xlVuXSCD/OMCWAClm/IwD5Lulp1Xc5gPN5EF2e7Qn/kqGup/r/931uGKWsbRSw/qFDYBz15E7Uf/T+zSErWqERT9dulFKzJ05/kLFmqhvE2KPYPH7EbFK2rUp6buCd8pMjj1nqUnJnsB7sT9blS61xaA3V9uM7NtV9fSyxopIFyJSgd33JP8FDuIIJod5FvgaGeSJOyUKKb8biITq7nm1zx8lMYGWr+0rztE4lNU14HfXIawDS91nhTFcgu0SibzG2SMSeDuc4csohPhqbIsaDh+/BWfeOUQ+iXgREmJi6jgMQnfv4npw5KUfUbYnlOKR29jBjpqvgtsExlr86ipzeTMdCC6DejBHYqhLUMo228hiDBo8KASiNdG+o4EMwKCBGUF+C4yLxs=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICvqJJVvdEemBIjp5B0R4oPOln/jBijvphQX1ui0yAWM
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGp1M+iKgQFSDb6gjrT0MOUR8XAevXHpaY4sIt+8bEVeDK9CrSZPwxISB2j0mxXXIg5/f5ukm6ADHq7ipk4hIv8=
                                             create=True mode=0644 path=/tmp/ansible.khul252d state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:02 compute-0 sudo[74075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:02 compute-0 sudo[74227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hptujqhxlstbahwrvoypvstfphqdfytd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158802.2976444-138-207197490054513/AnsiballZ_command.py'
Jan 23 09:00:02 compute-0 sudo[74227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:02 compute-0 python3.9[74229]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.khul252d' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:00:02 compute-0 sudo[74227]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:03 compute-0 sudo[74381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmtozzbqijnratlsuciivdzbopfmqrac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158802.8914745-162-12992733806647/AnsiballZ_file.py'
Jan 23 09:00:03 compute-0 sudo[74381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:03 compute-0 python3.9[74383]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.khul252d state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:03 compute-0 sudo[74381]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:03 compute-0 sshd-session[73468]: Connection closed by 192.168.122.30 port 41632
Jan 23 09:00:03 compute-0 sshd-session[73465]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:00:03 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 09:00:03 compute-0 systemd[1]: session-15.scope: Consumed 2.300s CPU time.
Jan 23 09:00:03 compute-0 systemd-logind[746]: Session 15 logged out. Waiting for processes to exit.
Jan 23 09:00:03 compute-0 systemd-logind[746]: Removed session 15.
Jan 23 09:00:08 compute-0 sshd-session[74408]: Accepted publickey for zuul from 192.168.122.30 port 60956 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:00:08 compute-0 systemd-logind[746]: New session 16 of user zuul.
Jan 23 09:00:08 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 23 09:00:08 compute-0 sshd-session[74408]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:00:09 compute-0 python3.9[74561]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:00:10 compute-0 sudo[74715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwymjkkzniorahrhmfdullfxuqrecxvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158809.6891973-56-207527508772547/AnsiballZ_systemd.py'
Jan 23 09:00:10 compute-0 sudo[74715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:10 compute-0 python3.9[74717]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:00:10 compute-0 sudo[74715]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:10 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:00:10 compute-0 sudo[74871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdrevczteipofkqljolbgevnxeslcxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158810.5331516-80-250421854353095/AnsiballZ_systemd.py'
Jan 23 09:00:10 compute-0 sudo[74871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:11 compute-0 python3.9[74873]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:00:11 compute-0 sudo[74871]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:11 compute-0 sudo[75024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhcvqwxnvhnuylkandnepveftouyvntg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158811.3814087-107-20245029247927/AnsiballZ_command.py'
Jan 23 09:00:11 compute-0 sudo[75024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:11 compute-0 python3.9[75026]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:00:11 compute-0 sudo[75024]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:12 compute-0 sudo[75177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbuozdkhlgskhtqqtvdykpacvxbprxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158812.0128343-131-232387114579639/AnsiballZ_stat.py'
Jan 23 09:00:12 compute-0 sudo[75177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:12 compute-0 python3.9[75179]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:00:12 compute-0 sudo[75177]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:12 compute-0 sudo[75331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jowtmfowlayoiodyelrafamuswhtfhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158812.6606014-155-264175189213683/AnsiballZ_command.py'
Jan 23 09:00:12 compute-0 sudo[75331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:13 compute-0 python3.9[75333]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:00:13 compute-0 sudo[75331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:13 compute-0 sudo[75486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygmqdujlklfzkbpdqivonclhwqubnfey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158813.1989386-179-153825429234799/AnsiballZ_file.py'
Jan 23 09:00:13 compute-0 sudo[75486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:13 compute-0 python3.9[75488]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:13 compute-0 sudo[75486]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:13 compute-0 sshd-session[74411]: Connection closed by 192.168.122.30 port 60956
Jan 23 09:00:13 compute-0 sshd-session[74408]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:00:13 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 09:00:13 compute-0 systemd[1]: session-16.scope: Consumed 3.142s CPU time.
Jan 23 09:00:13 compute-0 systemd-logind[746]: Session 16 logged out. Waiting for processes to exit.
Jan 23 09:00:13 compute-0 systemd-logind[746]: Removed session 16.
Jan 23 09:00:19 compute-0 sshd-session[75513]: Accepted publickey for zuul from 192.168.122.30 port 33434 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:00:19 compute-0 systemd-logind[746]: New session 17 of user zuul.
Jan 23 09:00:19 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 23 09:00:19 compute-0 sshd-session[75513]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:00:20 compute-0 python3.9[75666]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:00:21 compute-0 sudo[75820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zezwowynhyxomkqvjuocvuvmjnoqxnum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158820.8283763-63-172964087988447/AnsiballZ_setup.py'
Jan 23 09:00:21 compute-0 sudo[75820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:21 compute-0 python3.9[75822]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:00:21 compute-0 sudo[75820]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:21 compute-0 sudo[75904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqddnppitreusmpwejsifguuqoqfgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158820.8283763-63-172964087988447/AnsiballZ_dnf.py'
Jan 23 09:00:21 compute-0 sudo[75904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:21 compute-0 python3.9[75906]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:00:22 compute-0 sudo[75904]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:23 compute-0 python3.9[76057]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:00:24 compute-0 python3.9[76208]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:00:25 compute-0 python3.9[76358]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:00:25 compute-0 python3.9[76508]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:00:26 compute-0 sshd-session[75516]: Connection closed by 192.168.122.30 port 33434
Jan 23 09:00:26 compute-0 sshd-session[75513]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:00:26 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 09:00:26 compute-0 systemd[1]: session-17.scope: Consumed 4.244s CPU time.
Jan 23 09:00:26 compute-0 systemd-logind[746]: Session 17 logged out. Waiting for processes to exit.
Jan 23 09:00:26 compute-0 systemd-logind[746]: Removed session 17.
Jan 23 09:00:31 compute-0 sshd-session[76534]: Accepted publickey for zuul from 192.168.122.30 port 39268 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:00:31 compute-0 systemd-logind[746]: New session 18 of user zuul.
Jan 23 09:00:31 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 23 09:00:31 compute-0 sshd-session[76534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:00:32 compute-0 python3.9[76687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:00:33 compute-0 sudo[76841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulrwkwutuknqtgbffkhornjcxlltzlsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158833.5581787-112-201540247745752/AnsiballZ_file.py'
Jan 23 09:00:33 compute-0 sudo[76841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:34 compute-0 python3.9[76843]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:34 compute-0 sudo[76841]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:34 compute-0 sudo[76993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgiqxyjsaksresaxsbgyrhtplkzhaoqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158834.1206512-112-165356363650302/AnsiballZ_file.py'
Jan 23 09:00:34 compute-0 sudo[76993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:34 compute-0 python3.9[76995]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:34 compute-0 sudo[76993]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:34 compute-0 sudo[77145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehectcnefiloaquehghwafwoysthsqpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158834.6288471-158-198917607692345/AnsiballZ_stat.py'
Jan 23 09:00:34 compute-0 sudo[77145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:35 compute-0 python3.9[77147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:35 compute-0 sudo[77145]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:35 compute-0 sudo[77268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhzthpcganshtnzoaubioydnflwyaghw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158834.6288471-158-198917607692345/AnsiballZ_copy.py'
Jan 23 09:00:35 compute-0 sudo[77268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:35 compute-0 python3.9[77270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158834.6288471-158-198917607692345/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=7acdb2b62758196e1582e74ba0dd266079779f2d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:35 compute-0 sudo[77268]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:35 compute-0 sudo[77420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izqtrwahtuztcvgixupzvjtncnvoejgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158835.7057214-158-276444795087022/AnsiballZ_stat.py'
Jan 23 09:00:35 compute-0 sudo[77420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:36 compute-0 python3.9[77422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:36 compute-0 sudo[77420]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:36 compute-0 sudo[77543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbdgmgptlyaurqbbcblpizyhalyuywwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158835.7057214-158-276444795087022/AnsiballZ_copy.py'
Jan 23 09:00:36 compute-0 sudo[77543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:36 compute-0 python3.9[77545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158835.7057214-158-276444795087022/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=178b38fd55fc318e3497a0b0a90f821f605a83ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:36 compute-0 sudo[77543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:36 compute-0 sudo[77695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvrkkonshdvhkkgpomcfvicgedfrfdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158836.5121043-158-82056914941720/AnsiballZ_stat.py'
Jan 23 09:00:36 compute-0 sudo[77695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:36 compute-0 python3.9[77697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:36 compute-0 sudo[77695]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:37 compute-0 sudo[77818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebntppqnkuiwuuacidhdgnfootgcwyrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158836.5121043-158-82056914941720/AnsiballZ_copy.py'
Jan 23 09:00:37 compute-0 sudo[77818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:37 compute-0 python3.9[77820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158836.5121043-158-82056914941720/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=048ae38463c8c6a707b242601fa83057d7622be0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:37 compute-0 sudo[77818]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:37 compute-0 sudo[77970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwczkbazpyofxocyhnnvddllkdgvfrzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158837.4795783-285-567295083179/AnsiballZ_file.py'
Jan 23 09:00:37 compute-0 sudo[77970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:37 compute-0 python3.9[77972]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:37 compute-0 sudo[77970]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:38 compute-0 sudo[78122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhouhkuuzbmosyibptlpmcappmnsweh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158837.9312823-285-54585693913027/AnsiballZ_file.py'
Jan 23 09:00:38 compute-0 sudo[78122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:38 compute-0 python3.9[78124]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:38 compute-0 sudo[78122]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:38 compute-0 sudo[78274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmmhlevoxibytbollnixwyvvzglsymzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158838.597614-332-10055436632542/AnsiballZ_stat.py'
Jan 23 09:00:38 compute-0 sudo[78274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:38 compute-0 python3.9[78276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:38 compute-0 sudo[78274]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:39 compute-0 sudo[78397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkywtjjshibordiimbeykqqxkabzfilk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158838.597614-332-10055436632542/AnsiballZ_copy.py'
Jan 23 09:00:39 compute-0 sudo[78397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:39 compute-0 python3.9[78399]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158838.597614-332-10055436632542/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=dd7e2ed5e777b7b039e6f091d421da3f286ff876 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:39 compute-0 sudo[78397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:39 compute-0 sudo[78549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abfjjxodwrtpjtuogfyhzjxyladwwmhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158839.4978783-332-170976024000124/AnsiballZ_stat.py'
Jan 23 09:00:39 compute-0 sudo[78549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:39 compute-0 python3.9[78551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:39 compute-0 sudo[78549]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:40 compute-0 sudo[78672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-narvefiqrjtuqkrrhmragziwdonlgzqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158839.4978783-332-170976024000124/AnsiballZ_copy.py'
Jan 23 09:00:40 compute-0 sudo[78672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:40 compute-0 python3.9[78674]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158839.4978783-332-170976024000124/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=d6fa5b2b75cef1305cda84d4cb6e540bab5a60a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:40 compute-0 sudo[78672]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:40 compute-0 sudo[78824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqfyfhbijpusamoleyntgwkreihcjsqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158840.3459818-332-60163829619470/AnsiballZ_stat.py'
Jan 23 09:00:40 compute-0 sudo[78824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:40 compute-0 python3.9[78826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:40 compute-0 sudo[78824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:41 compute-0 sudo[78947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgttcmrmdfgkehiawhaseyukoitrgmag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158840.3459818-332-60163829619470/AnsiballZ_copy.py'
Jan 23 09:00:41 compute-0 sudo[78947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:41 compute-0 python3.9[78949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158840.3459818-332-60163829619470/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5db96c31f6f1175bc0241a492f6d3125bd637811 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:41 compute-0 sudo[78947]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:41 compute-0 sudo[79099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztrcdnhtjmkswygxrppbmonfhzjsucyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158841.6037097-468-223533748614353/AnsiballZ_file.py'
Jan 23 09:00:41 compute-0 sudo[79099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:42 compute-0 python3.9[79101]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:42 compute-0 sudo[79099]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:42 compute-0 sudo[79251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmnwpiuacmjrlbgwnjzatxtbdpofyyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158842.179443-468-87798987054841/AnsiballZ_file.py'
Jan 23 09:00:42 compute-0 sudo[79251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:42 compute-0 python3.9[79253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:42 compute-0 sudo[79251]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:42 compute-0 sudo[79403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhcavazlfabagadjvuecuozjggwdfzko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158842.699974-515-121030456075358/AnsiballZ_stat.py'
Jan 23 09:00:42 compute-0 sudo[79403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:43 compute-0 python3.9[79405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:43 compute-0 sudo[79403]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:43 compute-0 sudo[79526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khhuiykahpvwvmwweinaqllpdtlsscev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158842.699974-515-121030456075358/AnsiballZ_copy.py'
Jan 23 09:00:43 compute-0 sudo[79526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:43 compute-0 python3.9[79528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158842.699974-515-121030456075358/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c960d239ff59576e68a9ca381494398bb70fee97 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:43 compute-0 sudo[79526]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:43 compute-0 sudo[79678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrlzmypjgkjohcanueycbxwbosibvezo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158843.5906525-515-89793446026287/AnsiballZ_stat.py'
Jan 23 09:00:43 compute-0 sudo[79678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:44 compute-0 python3.9[79680]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:44 compute-0 sudo[79678]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:44 compute-0 sudo[79801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtutebpcyvywvhcmbnqxoqtadjfbgeov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158843.5906525-515-89793446026287/AnsiballZ_copy.py'
Jan 23 09:00:44 compute-0 sudo[79801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:44 compute-0 python3.9[79803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158843.5906525-515-89793446026287/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=2fa6ad873b253c0aac0a96d72964699d0cc12e8b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:44 compute-0 sudo[79801]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:44 compute-0 sudo[79953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubgqajhzvekfwtldukfndksebueddorv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158844.7042003-515-64054163874622/AnsiballZ_stat.py'
Jan 23 09:00:44 compute-0 sudo[79953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:45 compute-0 python3.9[79955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:45 compute-0 sudo[79953]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:45 compute-0 sudo[80076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmdppknjwzyjxwwuyajjizqxfvtejymb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158844.7042003-515-64054163874622/AnsiballZ_copy.py'
Jan 23 09:00:45 compute-0 sudo[80076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:45 compute-0 python3.9[80078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158844.7042003-515-64054163874622/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7030495d1c4b0926e784afc1d501e75828b8f6fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:45 compute-0 sudo[80076]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:45 compute-0 sudo[80228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhnpcjtogomgadtpxrnwwpcqskhwqulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158845.704849-656-6847134452078/AnsiballZ_file.py'
Jan 23 09:00:45 compute-0 sudo[80228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:46 compute-0 python3.9[80230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:46 compute-0 sudo[80228]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:46 compute-0 sudo[80380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyzpbdjbwczaotofvirvptzaqqmhblwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158846.154913-656-118514532668440/AnsiballZ_file.py'
Jan 23 09:00:46 compute-0 sudo[80380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:46 compute-0 python3.9[80382]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:46 compute-0 sudo[80380]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:46 compute-0 sudo[80532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzsyihsrgknrgqykvwruynljfhrhyhwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158846.7352312-706-83469527782960/AnsiballZ_stat.py'
Jan 23 09:00:46 compute-0 sudo[80532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:47 compute-0 python3.9[80534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:47 compute-0 sudo[80532]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:47 compute-0 sudo[80655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njupbzzymmyknsweasqpvmgplwyhrqyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158846.7352312-706-83469527782960/AnsiballZ_copy.py'
Jan 23 09:00:47 compute-0 sudo[80655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:47 compute-0 python3.9[80657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158846.7352312-706-83469527782960/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d97712930d51e2a929bf041a98ad1c1a93249375 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:47 compute-0 sudo[80655]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:47 compute-0 sudo[80807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whghatfzjdsdptvezndrsqhzimbpaakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158847.6340547-706-103237295020532/AnsiballZ_stat.py'
Jan 23 09:00:47 compute-0 sudo[80807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:47 compute-0 python3.9[80809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:48 compute-0 sudo[80807]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:48 compute-0 sudo[80930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epzvzbzlvvizkeztobkwlfduuqpruuqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158847.6340547-706-103237295020532/AnsiballZ_copy.py'
Jan 23 09:00:48 compute-0 sudo[80930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:48 compute-0 python3.9[80932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158847.6340547-706-103237295020532/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=2fa6ad873b253c0aac0a96d72964699d0cc12e8b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:48 compute-0 sudo[80930]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:48 compute-0 sudo[81082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsrxwmailaqwgssnlzmlfnsvecllczvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158848.5234706-706-246243438758517/AnsiballZ_stat.py'
Jan 23 09:00:48 compute-0 sudo[81082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:48 compute-0 python3.9[81084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:48 compute-0 sudo[81082]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:49 compute-0 sudo[81205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skokkvalhqfwuowephqqstxptpiqrqxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158848.5234706-706-246243438758517/AnsiballZ_copy.py'
Jan 23 09:00:49 compute-0 sudo[81205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:49 compute-0 python3.9[81207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158848.5234706-706-246243438758517/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f5ad364014ef3560c3db90adfb4a0f149a8b641f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:49 compute-0 sudo[81205]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:50 compute-0 sudo[81357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roycmcbymyrayozmcsskpxhklqqvghbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158850.1660106-888-47619081856246/AnsiballZ_file.py'
Jan 23 09:00:50 compute-0 sudo[81357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:50 compute-0 python3.9[81359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:50 compute-0 sudo[81357]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:50 compute-0 sudo[81509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyitelohwcxfetyqfjwueonifizmszbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158850.6585913-912-130176502671488/AnsiballZ_stat.py'
Jan 23 09:00:50 compute-0 sudo[81509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:50 compute-0 python3.9[81511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:51 compute-0 sudo[81509]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:51 compute-0 sudo[81632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juadsgrkeyqiymdyuiargvxqufnpmuil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158850.6585913-912-130176502671488/AnsiballZ_copy.py'
Jan 23 09:00:51 compute-0 sudo[81632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:51 compute-0 python3.9[81634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158850.6585913-912-130176502671488/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:51 compute-0 sudo[81632]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:51 compute-0 sudo[81784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivsagbmsmeidffxtlknxxlhhzlyolqdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158851.6034675-960-40092208951829/AnsiballZ_file.py'
Jan 23 09:00:51 compute-0 sudo[81784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:51 compute-0 python3.9[81786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:51 compute-0 sudo[81784]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:52 compute-0 sudo[81936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdwnqwlugkelygjdoyhiricirshdxgwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158852.0807567-986-162221309323868/AnsiballZ_stat.py'
Jan 23 09:00:52 compute-0 sudo[81936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:52 compute-0 python3.9[81938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:52 compute-0 sudo[81936]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:52 compute-0 sudo[82059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrixokhcirowdbvveloctablutrzrng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158852.0807567-986-162221309323868/AnsiballZ_copy.py'
Jan 23 09:00:52 compute-0 sudo[82059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:52 compute-0 python3.9[82061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158852.0807567-986-162221309323868/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:52 compute-0 sudo[82059]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:53 compute-0 sudo[82211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlofzbhdrojeawqbvnmovaxhwyhqonlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158853.002304-1035-107317273146722/AnsiballZ_file.py'
Jan 23 09:00:53 compute-0 sudo[82211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:53 compute-0 python3.9[82213]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:53 compute-0 sudo[82211]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:53 compute-0 sudo[82363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkkcxnkbfzxwutflekjmemuzoqmpgmgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158853.4857526-1061-32484417578084/AnsiballZ_stat.py'
Jan 23 09:00:53 compute-0 sudo[82363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:53 compute-0 python3.9[82365]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:53 compute-0 sudo[82363]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:54 compute-0 sudo[82486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufmrixetqiduuxjjbxvyvdktjinbqhcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158853.4857526-1061-32484417578084/AnsiballZ_copy.py'
Jan 23 09:00:54 compute-0 sudo[82486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:54 compute-0 python3.9[82488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158853.4857526-1061-32484417578084/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:54 compute-0 sudo[82486]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:54 compute-0 sudo[82638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukkbdkbkwtcptpeihcfyrnblknjplijz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158854.3527975-1108-64295663835439/AnsiballZ_file.py'
Jan 23 09:00:54 compute-0 sudo[82638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:54 compute-0 python3.9[82640]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:54 compute-0 sudo[82638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:54 compute-0 sudo[82790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvupszaagpgipibsvjwzlcpflxogasgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158854.799682-1132-67843580077400/AnsiballZ_stat.py'
Jan 23 09:00:54 compute-0 sudo[82790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:55 compute-0 python3.9[82792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:55 compute-0 sudo[82790]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:55 compute-0 sudo[82913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajftmxvvneluwmduuufyckospdjwvyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158854.799682-1132-67843580077400/AnsiballZ_copy.py'
Jan 23 09:00:55 compute-0 sudo[82913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:55 compute-0 python3.9[82915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158854.799682-1132-67843580077400/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:55 compute-0 sudo[82913]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:55 compute-0 sudo[83065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovagxqxmerpwewmpqknkgmmabgeczvnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158855.727779-1181-28124554664367/AnsiballZ_file.py'
Jan 23 09:00:55 compute-0 sudo[83065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:56 compute-0 python3.9[83067]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:56 compute-0 sudo[83065]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:56 compute-0 sudo[83217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvvgtyiurkuobsrsdlumwhlkehsakiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158856.187001-1204-118455521094661/AnsiballZ_stat.py'
Jan 23 09:00:56 compute-0 sudo[83217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:56 compute-0 python3.9[83219]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:56 compute-0 sudo[83217]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:56 compute-0 sudo[83340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peffgezevzcottdzgmobxdplkmgndzsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158856.187001-1204-118455521094661/AnsiballZ_copy.py'
Jan 23 09:00:56 compute-0 sudo[83340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:56 compute-0 python3.9[83342]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158856.187001-1204-118455521094661/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:56 compute-0 sudo[83340]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:57 compute-0 sudo[83492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtwlqdqqmokedjwpxjioopyjjvleqzcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158857.1046941-1250-143065229484719/AnsiballZ_file.py'
Jan 23 09:00:57 compute-0 sudo[83492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:57 compute-0 python3.9[83494]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:57 compute-0 sudo[83492]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:57 compute-0 sudo[83644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbxcgoyaudhipusmuigyvtcvdntwzmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158857.5759776-1276-193072663439946/AnsiballZ_stat.py'
Jan 23 09:00:57 compute-0 sudo[83644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:57 compute-0 python3.9[83646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:57 compute-0 sudo[83644]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:58 compute-0 sudo[83767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwjebdrljcizdkvbwvhlxmcfmkjsaiba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158857.5759776-1276-193072663439946/AnsiballZ_copy.py'
Jan 23 09:00:58 compute-0 sudo[83767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:58 compute-0 python3.9[83769]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158857.5759776-1276-193072663439946/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:58 compute-0 sudo[83767]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:58 compute-0 sudo[83919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odxuiuwarvjqklehsoyfuvdnssdangvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158858.4631457-1323-220784895967604/AnsiballZ_file.py'
Jan 23 09:00:58 compute-0 sudo[83919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:58 compute-0 python3.9[83921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:00:58 compute-0 sudo[83919]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:59 compute-0 sudo[84071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiiezctsquuvynsganfzslxvbycqgxbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158858.932784-1344-32815609559837/AnsiballZ_stat.py'
Jan 23 09:00:59 compute-0 sudo[84071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:59 compute-0 python3.9[84073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:00:59 compute-0 sudo[84071]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:59 compute-0 sudo[84194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuojqsoistzvehnpyynsxnwynxmxeyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158858.932784-1344-32815609559837/AnsiballZ_copy.py'
Jan 23 09:00:59 compute-0 sudo[84194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:00:59 compute-0 python3.9[84196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158858.932784-1344-32815609559837/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8561196ac452c885b39729ffdf75bbc977d4d7d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:00:59 compute-0 sudo[84194]: pam_unix(sudo:session): session closed for user root
Jan 23 09:00:59 compute-0 sshd-session[76537]: Connection closed by 192.168.122.30 port 39268
Jan 23 09:00:59 compute-0 sshd-session[76534]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:00:59 compute-0 systemd-logind[746]: Session 18 logged out. Waiting for processes to exit.
Jan 23 09:00:59 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 09:00:59 compute-0 systemd[1]: session-18.scope: Consumed 19.937s CPU time.
Jan 23 09:00:59 compute-0 systemd-logind[746]: Removed session 18.
Jan 23 09:01:01 compute-0 CROND[84222]: (root) CMD (run-parts /etc/cron.hourly)
Jan 23 09:01:01 compute-0 run-parts[84225]: (/etc/cron.hourly) starting 0anacron
Jan 23 09:01:01 compute-0 anacron[84233]: Anacron started on 2026-01-23
Jan 23 09:01:01 compute-0 anacron[84233]: Will run job `cron.daily' in 31 min.
Jan 23 09:01:01 compute-0 anacron[84233]: Will run job `cron.weekly' in 51 min.
Jan 23 09:01:01 compute-0 anacron[84233]: Will run job `cron.monthly' in 71 min.
Jan 23 09:01:01 compute-0 anacron[84233]: Jobs will be executed sequentially
Jan 23 09:01:01 compute-0 run-parts[84235]: (/etc/cron.hourly) finished 0anacron
Jan 23 09:01:01 compute-0 CROND[84221]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 23 09:01:05 compute-0 sshd-session[84236]: Accepted publickey for zuul from 192.168.122.30 port 34096 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:01:05 compute-0 systemd-logind[746]: New session 19 of user zuul.
Jan 23 09:01:05 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 23 09:01:05 compute-0 sshd-session[84236]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:06 compute-0 python3.9[84389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:06 compute-0 sudo[84543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbcuovfttjualosnbofspnditsnqqcyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158866.388028-62-180671356781008/AnsiballZ_file.py'
Jan 23 09:01:06 compute-0 sudo[84543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:06 compute-0 python3.9[84545]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:06 compute-0 sudo[84543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:07 compute-0 sudo[84695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcgnrobyrboqttlfjtqpxoezqnxjgcxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158867.002917-62-182753154935577/AnsiballZ_file.py'
Jan 23 09:01:07 compute-0 sudo[84695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:07 compute-0 python3.9[84697]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:07 compute-0 sudo[84695]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:07 compute-0 python3.9[84847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:08 compute-0 sudo[84997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkuwlhqhgfbbsiydqcknejnmmlggaidi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158868.0459912-131-194599140342321/AnsiballZ_seboolean.py'
Jan 23 09:01:08 compute-0 sudo[84997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:08 compute-0 python3.9[84999]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 09:01:09 compute-0 sudo[84997]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:09 compute-0 sudo[85153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hptbprzzoajrzpmeluifrncykczdkzfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158869.5273883-161-98182045798877/AnsiballZ_setup.py'
Jan 23 09:01:09 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 09:01:09 compute-0 sudo[85153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:09 compute-0 python3.9[85155]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:01:10 compute-0 sudo[85153]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:10 compute-0 sudo[85237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdcmalbcsakurojkkmnazbzilkybzssb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158869.5273883-161-98182045798877/AnsiballZ_dnf.py'
Jan 23 09:01:10 compute-0 sudo[85237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:10 compute-0 python3.9[85239]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:01:11 compute-0 sudo[85237]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:12 compute-0 sudo[85390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvvwsksntamkrfiyiyblwxxrfltrgxja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158871.7001019-197-145904218244240/AnsiballZ_systemd.py'
Jan 23 09:01:12 compute-0 sudo[85390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:12 compute-0 python3.9[85392]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:01:12 compute-0 sudo[85390]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:12 compute-0 sudo[85545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgkhlfwibwlcaqfmidypbxaymfikmhem ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769158872.602596-221-115768817455311/AnsiballZ_edpm_nftables_snippet.py'
Jan 23 09:01:12 compute-0 sudo[85545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:13 compute-0 python3[85547]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 09:01:13 compute-0 sudo[85545]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:13 compute-0 sudo[85697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzrkjtkriudwstcoatwywlqzbsgszdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158873.3286862-248-212800872881812/AnsiballZ_file.py'
Jan 23 09:01:13 compute-0 sudo[85697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:13 compute-0 python3.9[85699]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:13 compute-0 sudo[85697]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:14 compute-0 sudo[85849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewzuglfqpktetsdnethjjetjogbffrvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158873.7873988-272-209950688951711/AnsiballZ_stat.py'
Jan 23 09:01:14 compute-0 sudo[85849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:14 compute-0 python3.9[85851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:14 compute-0 sudo[85849]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:14 compute-0 sudo[85927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuqzbelynoppwvqoqoftjfrtxzrjozem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158873.7873988-272-209950688951711/AnsiballZ_file.py'
Jan 23 09:01:14 compute-0 sudo[85927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:14 compute-0 python3.9[85929]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:14 compute-0 sudo[85927]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:14 compute-0 sudo[86079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftyghtviuupfwhygekczalungvcrolsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158874.7273805-308-60963355282560/AnsiballZ_stat.py'
Jan 23 09:01:14 compute-0 sudo[86079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:15 compute-0 python3.9[86081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:15 compute-0 sudo[86079]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:15 compute-0 sudo[86157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmtakjycgqxsairzhptdvyzlqaqtvycl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158874.7273805-308-60963355282560/AnsiballZ_file.py'
Jan 23 09:01:15 compute-0 sudo[86157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:15 compute-0 python3.9[86159]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wybgg9ob recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:15 compute-0 sudo[86157]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:15 compute-0 sudo[86309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgjjzxjmkdzdpsfjbidqhwhzgexpdtcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158875.6807823-344-267721817961921/AnsiballZ_stat.py'
Jan 23 09:01:15 compute-0 sudo[86309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:16 compute-0 python3.9[86311]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:16 compute-0 sudo[86309]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:16 compute-0 sudo[86387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsykdoaprsalxlblmowphllvugejumkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158875.6807823-344-267721817961921/AnsiballZ_file.py'
Jan 23 09:01:16 compute-0 sudo[86387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:16 compute-0 python3.9[86389]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:16 compute-0 sudo[86387]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:16 compute-0 sudo[86539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wceoxzflnzvrmrcfkdovwfqkbkvnurpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158876.5649884-383-47966802451318/AnsiballZ_command.py'
Jan 23 09:01:16 compute-0 sudo[86539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:17 compute-0 python3.9[86541]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:17 compute-0 sudo[86539]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:17 compute-0 sudo[86692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnaotzjuonqejnlhjyzqpfzxjligleky ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769158877.209607-407-241166263849859/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:01:17 compute-0 sudo[86692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:17 compute-0 python3[86694]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:01:17 compute-0 sudo[86692]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:18 compute-0 sudo[86844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwhntlpjliwvbgxymdsmdabbtircnwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158877.828279-431-258779154362988/AnsiballZ_stat.py'
Jan 23 09:01:18 compute-0 sudo[86844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:18 compute-0 python3.9[86846]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:18 compute-0 sudo[86844]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:18 compute-0 sudo[86969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcbvyytwnxsnukluwstmadghmojpncih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158877.828279-431-258779154362988/AnsiballZ_copy.py'
Jan 23 09:01:18 compute-0 sudo[86969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:18 compute-0 python3.9[86971]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158877.828279-431-258779154362988/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:18 compute-0 sudo[86969]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:19 compute-0 sudo[87121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jydczbnpljbfiaksvvstifjbdpwvplpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158878.8338768-476-189467133918343/AnsiballZ_stat.py'
Jan 23 09:01:19 compute-0 sudo[87121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:19 compute-0 python3.9[87123]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:19 compute-0 sudo[87121]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:19 compute-0 sudo[87246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvwgswhyjkfsthagpvfgiznxkmsjhnxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158878.8338768-476-189467133918343/AnsiballZ_copy.py'
Jan 23 09:01:19 compute-0 sudo[87246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:19 compute-0 python3.9[87248]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158878.8338768-476-189467133918343/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:19 compute-0 sudo[87246]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:19 compute-0 sudo[87398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgkmelnyesqkwoayttcinneqotpymenq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158879.7379167-521-170456334195197/AnsiballZ_stat.py'
Jan 23 09:01:19 compute-0 sudo[87398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:20 compute-0 python3.9[87400]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:20 compute-0 sudo[87398]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:20 compute-0 sudo[87523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmqbzbdgvrndedpqpwtqjzoydkitkzkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158879.7379167-521-170456334195197/AnsiballZ_copy.py'
Jan 23 09:01:20 compute-0 sudo[87523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:20 compute-0 python3.9[87525]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158879.7379167-521-170456334195197/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:20 compute-0 sudo[87523]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:20 compute-0 sudo[87675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffnozblvguenkuzkqzwanwpludyczwrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158880.675281-566-143578723994413/AnsiballZ_stat.py'
Jan 23 09:01:20 compute-0 sudo[87675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:21 compute-0 python3.9[87677]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:21 compute-0 sudo[87675]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:21 compute-0 sudo[87800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlvihnfpsutfcmagjajyxmzmsqjvuekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158880.675281-566-143578723994413/AnsiballZ_copy.py'
Jan 23 09:01:21 compute-0 sudo[87800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:21 compute-0 chronyd[64253]: Selected source 166.88.142.52 (pool.ntp.org)
Jan 23 09:01:21 compute-0 python3.9[87802]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158880.675281-566-143578723994413/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:21 compute-0 sudo[87800]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:21 compute-0 sudo[87952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpgumrkeiexznpnxlrqphcvfvetatjde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158881.5484705-611-80538450961864/AnsiballZ_stat.py'
Jan 23 09:01:21 compute-0 sudo[87952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:21 compute-0 python3.9[87954]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:22 compute-0 sudo[87952]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:22 compute-0 sudo[88077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzmkkmoczapbxjaswztxodlmrifqwrsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158881.5484705-611-80538450961864/AnsiballZ_copy.py'
Jan 23 09:01:22 compute-0 sudo[88077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:22 compute-0 python3.9[88079]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158881.5484705-611-80538450961864/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:22 compute-0 sudo[88077]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:22 compute-0 sudo[88229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xebakhbgselcdweaxqwpuwtlvkkfrhyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158882.5064924-656-52122489200675/AnsiballZ_file.py'
Jan 23 09:01:22 compute-0 sudo[88229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:22 compute-0 python3.9[88231]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:22 compute-0 sudo[88229]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:23 compute-0 sudo[88381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjzbzvypisabwsvshqwgxmrjnasiwcqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158883.0106027-680-90521907036650/AnsiballZ_command.py'
Jan 23 09:01:23 compute-0 sudo[88381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:23 compute-0 python3.9[88383]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:23 compute-0 sudo[88381]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:23 compute-0 sudo[88536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqerfvjibvzbsfvbcmrkdcvxxcdffemb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158883.5047047-704-145315062797431/AnsiballZ_blockinfile.py'
Jan 23 09:01:23 compute-0 sudo[88536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:23 compute-0 python3.9[88538]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:23 compute-0 sudo[88536]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:24 compute-0 sudo[88688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhstjgbvcfoqdilsadvajgaxfkuihenf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158884.175078-731-214403375743217/AnsiballZ_command.py'
Jan 23 09:01:24 compute-0 sudo[88688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:24 compute-0 python3.9[88690]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:24 compute-0 sudo[88688]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:24 compute-0 sudo[88841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olybpdsnmjjutnhkxdoibisbukvhuvcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158884.6692088-755-243444881901212/AnsiballZ_stat.py'
Jan 23 09:01:24 compute-0 sudo[88841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:24 compute-0 python3.9[88843]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:01:25 compute-0 sudo[88841]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:25 compute-0 sudo[88995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqxpkmhcuwrvzxqntqudbavykkmbcudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158885.13662-779-44705037843063/AnsiballZ_command.py'
Jan 23 09:01:25 compute-0 sudo[88995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:25 compute-0 python3.9[88997]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:25 compute-0 sudo[88995]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:25 compute-0 sudo[89150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdmzkjgpkviaolwjevcnubytldcrjigv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158885.6738725-803-85091288061690/AnsiballZ_file.py'
Jan 23 09:01:25 compute-0 sudo[89150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:26 compute-0 python3.9[89152]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:26 compute-0 sudo[89150]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:26 compute-0 python3.9[89302]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:27 compute-0 sudo[89453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjobqoswuqbxjdxppbuqquisxfdfirx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158887.695846-923-21128436146467/AnsiballZ_command.py'
Jan 23 09:01:27 compute-0 sudo[89453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:28 compute-0 python3.9[89455]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:28 compute-0 ovs-vsctl[89456]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 09:01:28 compute-0 sudo[89453]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:28 compute-0 sudo[89606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiubpffzatwlhxofgmqzldkssxsphggx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158888.209571-950-162812780475229/AnsiballZ_command.py'
Jan 23 09:01:28 compute-0 sudo[89606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:28 compute-0 python3.9[89608]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:28 compute-0 sudo[89606]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:28 compute-0 sudo[89761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydxilxlhurwgvjrzfuqvpoljjdzayzzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158888.6788635-974-97581357219340/AnsiballZ_command.py'
Jan 23 09:01:28 compute-0 sudo[89761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:28 compute-0 python3.9[89763]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:29 compute-0 ovs-vsctl[89764]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 09:01:29 compute-0 sudo[89761]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:29 compute-0 python3.9[89914]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:01:29 compute-0 sudo[90066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umcklvymtoyaqdjcsvkltmbhixquohca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158889.7208443-1025-215872751363576/AnsiballZ_file.py'
Jan 23 09:01:29 compute-0 sudo[90066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:30 compute-0 python3.9[90068]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:30 compute-0 sudo[90066]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:30 compute-0 sudo[90218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcjjedotwijnjqaarvplguocjhfqcxay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158890.1920226-1049-235560327725163/AnsiballZ_stat.py'
Jan 23 09:01:30 compute-0 sudo[90218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:30 compute-0 python3.9[90220]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:30 compute-0 sudo[90218]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:30 compute-0 sudo[90296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajpsxntglahlhlodrnshltrmcopadom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158890.1920226-1049-235560327725163/AnsiballZ_file.py'
Jan 23 09:01:30 compute-0 sudo[90296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:30 compute-0 python3.9[90298]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:30 compute-0 sudo[90296]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:31 compute-0 sudo[90448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dduvlekecytohqkcfelnkmsgxucoxgny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158890.9832296-1049-136041984879602/AnsiballZ_stat.py'
Jan 23 09:01:31 compute-0 sudo[90448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:31 compute-0 python3.9[90450]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:31 compute-0 sudo[90448]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:31 compute-0 sudo[90526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xemapudijkjfuzlsdgmycshxzpakwvyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158890.9832296-1049-136041984879602/AnsiballZ_file.py'
Jan 23 09:01:31 compute-0 sudo[90526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:31 compute-0 python3.9[90528]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:31 compute-0 sudo[90526]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:31 compute-0 sudo[90678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gocftpstkddechxilydfkhmkjgmtarkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158891.821679-1118-217558367544712/AnsiballZ_file.py'
Jan 23 09:01:31 compute-0 sudo[90678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:32 compute-0 python3.9[90680]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:32 compute-0 sudo[90678]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:32 compute-0 sudo[90830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wccqgphwgwomyngigobdzfcjoqnhwbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158892.293972-1142-113753032166974/AnsiballZ_stat.py'
Jan 23 09:01:32 compute-0 sudo[90830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:32 compute-0 python3.9[90832]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:32 compute-0 sudo[90830]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:32 compute-0 sudo[90908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkuevlqcaunhrntedltwoaebqyavoxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158892.293972-1142-113753032166974/AnsiballZ_file.py'
Jan 23 09:01:32 compute-0 sudo[90908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:32 compute-0 python3.9[90910]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:32 compute-0 sudo[90908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:33 compute-0 sudo[91060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlzwwjihcwcdumhamuqtfragsvezzocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158893.0852256-1178-151341874497870/AnsiballZ_stat.py'
Jan 23 09:01:33 compute-0 sudo[91060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:33 compute-0 python3.9[91062]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:33 compute-0 sudo[91060]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:33 compute-0 sudo[91138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrwkejmplmjqkcaeykyfmsqpaegqbidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158893.0852256-1178-151341874497870/AnsiballZ_file.py'
Jan 23 09:01:33 compute-0 sudo[91138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:33 compute-0 python3.9[91140]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:33 compute-0 sudo[91138]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:34 compute-0 sudo[91290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eimsavgyqrzhdyjfrbubtiiupkkmwctr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158893.8819206-1214-32795865845595/AnsiballZ_systemd.py'
Jan 23 09:01:34 compute-0 sudo[91290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:34 compute-0 python3.9[91292]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:01:34 compute-0 systemd[1]: Reloading.
Jan 23 09:01:34 compute-0 systemd-sysv-generator[91319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:01:34 compute-0 systemd-rc-local-generator[91316]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:01:34 compute-0 sudo[91290]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:34 compute-0 sudo[91480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgskmtktipqhvbarrfgfxzzrhbjsowgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158894.6784391-1238-101508574023671/AnsiballZ_stat.py'
Jan 23 09:01:34 compute-0 sudo[91480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:35 compute-0 python3.9[91482]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:35 compute-0 sudo[91480]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:35 compute-0 sudo[91558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhwlyhsarqqmbewimwzjcnugyjbshlub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158894.6784391-1238-101508574023671/AnsiballZ_file.py'
Jan 23 09:01:35 compute-0 sudo[91558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:35 compute-0 python3.9[91560]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:35 compute-0 sudo[91558]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:35 compute-0 sudo[91710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trhyezxuavrdamebxclroiupxboyzuzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158895.5507998-1274-147144544978751/AnsiballZ_stat.py'
Jan 23 09:01:35 compute-0 sudo[91710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:35 compute-0 python3.9[91712]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:35 compute-0 sudo[91710]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:36 compute-0 sudo[91788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvegfbajiyxnszcktergphqykcrstfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158895.5507998-1274-147144544978751/AnsiballZ_file.py'
Jan 23 09:01:36 compute-0 sudo[91788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:36 compute-0 python3.9[91790]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:36 compute-0 sudo[91788]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:36 compute-0 sudo[91940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osihfamaqfsqgynksdykfybehebvjxxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158896.4456816-1310-36387456333957/AnsiballZ_systemd.py'
Jan 23 09:01:36 compute-0 sudo[91940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:36 compute-0 python3.9[91942]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:01:36 compute-0 systemd[1]: Reloading.
Jan 23 09:01:36 compute-0 systemd-rc-local-generator[91966]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:01:36 compute-0 systemd-sysv-generator[91969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:01:37 compute-0 systemd[1]: Starting Create netns directory...
Jan 23 09:01:37 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:01:37 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:01:37 compute-0 systemd[1]: Finished Create netns directory.
Jan 23 09:01:37 compute-0 sudo[91940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:37 compute-0 sudo[92134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgbatcvvcmpqbxqfadqrdsbyrcywoqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158897.4031472-1340-108438508280580/AnsiballZ_file.py'
Jan 23 09:01:37 compute-0 sudo[92134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:37 compute-0 python3.9[92136]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:37 compute-0 sudo[92134]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:38 compute-0 sudo[92286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnunqjhobzypucpvlmcxbjakvetvjiiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158897.8942149-1364-185142725577188/AnsiballZ_stat.py'
Jan 23 09:01:38 compute-0 sudo[92286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:38 compute-0 python3.9[92288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:38 compute-0 sudo[92286]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:38 compute-0 sudo[92409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezgwjwqsgwoviusqmnqgtekrkdrxdwsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158897.8942149-1364-185142725577188/AnsiballZ_copy.py'
Jan 23 09:01:38 compute-0 sudo[92409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:38 compute-0 python3.9[92411]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158897.8942149-1364-185142725577188/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:38 compute-0 sudo[92409]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:39 compute-0 sudo[92561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vocvgebfxhbndoslnvtwgpskwfntcsyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158899.0136662-1415-42380148917886/AnsiballZ_file.py'
Jan 23 09:01:39 compute-0 sudo[92561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:39 compute-0 python3.9[92563]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:39 compute-0 sudo[92561]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:39 compute-0 sudo[92713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gagvrlmwwjnswgtdsynvdavoirmgrfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158899.537303-1439-62520839270983/AnsiballZ_file.py'
Jan 23 09:01:39 compute-0 sudo[92713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:39 compute-0 python3.9[92715]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:39 compute-0 sudo[92713]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:40 compute-0 sudo[92865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-simdwwrxnjebcynxlcxiatymyngoxnun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158900.047874-1463-218424737406785/AnsiballZ_stat.py'
Jan 23 09:01:40 compute-0 sudo[92865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:40 compute-0 python3.9[92867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:40 compute-0 sudo[92865]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:40 compute-0 sudo[92988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmtotueejfdjsxsueiehboqmnqfiusvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158900.047874-1463-218424737406785/AnsiballZ_copy.py'
Jan 23 09:01:40 compute-0 sudo[92988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:40 compute-0 python3.9[92990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158900.047874-1463-218424737406785/.source.json _original_basename=.dclt8b60 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:40 compute-0 sudo[92988]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:41 compute-0 python3.9[93140]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:42 compute-0 sudo[93561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruibyputicutzxhwaoomdycbqomzfqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158902.6721098-1583-37678951802874/AnsiballZ_container_config_data.py'
Jan 23 09:01:42 compute-0 sudo[93561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:43 compute-0 python3.9[93563]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 09:01:43 compute-0 sudo[93561]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:43 compute-0 sudo[93713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbuyxujgokuukfefzvyadvtugdovqvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158903.455181-1616-39908513075269/AnsiballZ_container_config_hash.py'
Jan 23 09:01:43 compute-0 sudo[93713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:43 compute-0 python3.9[93715]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:01:43 compute-0 sudo[93713]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:44 compute-0 sudo[93865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaqitakchcgqxvkexvsqvebqdpuxjfju ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769158904.2474647-1646-55234079574463/AnsiballZ_edpm_container_manage.py'
Jan 23 09:01:44 compute-0 sudo[93865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:44 compute-0 python3[93867]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:01:44 compute-0 podman[93896]: 2026-01-23 09:01:44.940462382 +0000 UTC m=+0.027384606 container create f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:01:44 compute-0 podman[93896]: 2026-01-23 09:01:44.928096169 +0000 UTC m=+0.015018413 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 09:01:44 compute-0 python3[93867]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 09:01:45 compute-0 sudo[93865]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:45 compute-0 sudo[94075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpnfhrqkfbjwlajwvwvoxijhjpopzqha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158905.1470785-1670-279794944009063/AnsiballZ_stat.py'
Jan 23 09:01:45 compute-0 sudo[94075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:45 compute-0 python3.9[94077]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:01:45 compute-0 sudo[94075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:01:45 compute-0 sudo[94229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvpnettpjaaifcddnwjhmjxyvehyrgly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158905.729787-1697-137254997640855/AnsiballZ_file.py'
Jan 23 09:01:45 compute-0 sudo[94229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:46 compute-0 python3.9[94231]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:46 compute-0 sudo[94229]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:46 compute-0 sudo[94305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwhqkiuweonwhokicbsyztkgrrcdhcwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158905.729787-1697-137254997640855/AnsiballZ_stat.py'
Jan 23 09:01:46 compute-0 sudo[94305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:46 compute-0 python3.9[94307]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:01:46 compute-0 sudo[94305]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:46 compute-0 sudo[94456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddxddbmfxwforvnygxjoxzxvbhzqtwft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158906.5764382-1697-161045580390548/AnsiballZ_copy.py'
Jan 23 09:01:46 compute-0 sudo[94456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:47 compute-0 python3.9[94458]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158906.5764382-1697-161045580390548/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:47 compute-0 sudo[94456]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:47 compute-0 sudo[94532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcgegosmgdinmjwmkxhqdjiqwbaqjtqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158906.5764382-1697-161045580390548/AnsiballZ_systemd.py'
Jan 23 09:01:47 compute-0 sudo[94532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:47 compute-0 python3.9[94534]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:01:47 compute-0 systemd[1]: Reloading.
Jan 23 09:01:47 compute-0 systemd-rc-local-generator[94557]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:01:47 compute-0 systemd-sysv-generator[94561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:01:47 compute-0 sudo[94532]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:47 compute-0 sudo[94643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryvahmopwzemnynapibjejkzeisrlogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158906.5764382-1697-161045580390548/AnsiballZ_systemd.py'
Jan 23 09:01:47 compute-0 sudo[94643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:48 compute-0 python3.9[94645]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:01:48 compute-0 systemd[1]: Reloading.
Jan 23 09:01:48 compute-0 systemd-rc-local-generator[94668]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:01:48 compute-0 systemd-sysv-generator[94671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:01:48 compute-0 systemd[1]: Starting ovn_controller container...
Jan 23 09:01:48 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 09:01:48 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:01:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326bc298fae466bbaf5752436a02f4133a515b2c197baf747aa5f28d8fcc2432/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 09:01:48 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d.
Jan 23 09:01:48 compute-0 podman[94685]: 2026-01-23 09:01:48.364140131 +0000 UTC m=+0.081568550 container init f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + sudo -E kolla_set_configs
Jan 23 09:01:48 compute-0 podman[94685]: 2026-01-23 09:01:48.388504822 +0000 UTC m=+0.105933251 container start f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Jan 23 09:01:48 compute-0 edpm-start-podman-container[94685]: ovn_controller
Jan 23 09:01:48 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 23 09:01:48 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 09:01:48 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 09:01:48 compute-0 edpm-start-podman-container[94684]: Creating additional drop-in dependency for "ovn_controller" (f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d)
Jan 23 09:01:48 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 23 09:01:48 compute-0 systemd[94726]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 09:01:48 compute-0 systemd[1]: Reloading.
Jan 23 09:01:48 compute-0 podman[94704]: 2026-01-23 09:01:48.459832125 +0000 UTC m=+0.063055039 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:01:48 compute-0 systemd[94726]: Queued start job for default target Main User Target.
Jan 23 09:01:48 compute-0 systemd-rc-local-generator[94775]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:01:48 compute-0 systemd-sysv-generator[94779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:01:48 compute-0 systemd[94726]: Created slice User Application Slice.
Jan 23 09:01:48 compute-0 systemd[94726]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 09:01:48 compute-0 systemd[94726]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:01:48 compute-0 systemd[94726]: Reached target Paths.
Jan 23 09:01:48 compute-0 systemd[94726]: Reached target Timers.
Jan 23 09:01:48 compute-0 systemd[94726]: Starting D-Bus User Message Bus Socket...
Jan 23 09:01:48 compute-0 systemd[94726]: Starting Create User's Volatile Files and Directories...
Jan 23 09:01:48 compute-0 systemd[94726]: Finished Create User's Volatile Files and Directories.
Jan 23 09:01:48 compute-0 systemd[94726]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:01:48 compute-0 systemd[94726]: Reached target Sockets.
Jan 23 09:01:48 compute-0 systemd[94726]: Reached target Basic System.
Jan 23 09:01:48 compute-0 systemd[94726]: Reached target Main User Target.
Jan 23 09:01:48 compute-0 systemd[94726]: Startup finished in 86ms.
Jan 23 09:01:48 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 23 09:01:48 compute-0 systemd[1]: Started ovn_controller container.
Jan 23 09:01:48 compute-0 systemd[1]: f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d-465e6dd946dde876.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 09:01:48 compute-0 systemd[1]: f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d-465e6dd946dde876.service: Failed with result 'exit-code'.
Jan 23 09:01:48 compute-0 systemd[1]: Started Session c1 of User root.
Jan 23 09:01:48 compute-0 sudo[94643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:48 compute-0 ovn_controller[94697]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 09:01:48 compute-0 ovn_controller[94697]: INFO:__main__:Validating config file
Jan 23 09:01:48 compute-0 ovn_controller[94697]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 09:01:48 compute-0 ovn_controller[94697]: INFO:__main__:Writing out command to execute
Jan 23 09:01:48 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: ++ cat /run_command
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + ARGS=
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + sudo kolla_copy_cacerts
Jan 23 09:01:48 compute-0 systemd[1]: Started Session c2 of User root.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + [[ ! -n '' ]]
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + . kolla_extend_start
Jan 23 09:01:48 compute-0 ovn_controller[94697]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + umask 0022
Jan 23 09:01:48 compute-0 ovn_controller[94697]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 09:01:48 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7509] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7514] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <warn>  [1769158908.7515] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7520] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7524] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7526] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 09:01:48 compute-0 kernel: br-int: entered promiscuous mode
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 09:01:48 compute-0 ovn_controller[94697]: 2026-01-23T09:01:48Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7634] manager: (ovn-66e142-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7638] manager: (ovn-116624-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 09:01:48 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7748] device (genev_sys_6081): carrier: link connected
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7750] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 09:01:48 compute-0 systemd-udevd[94825]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:01:48 compute-0 systemd-udevd[94830]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:01:48 compute-0 NetworkManager[54920]: <info>  [1769158908.7903] manager: (ovn-4d28b1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 09:01:49 compute-0 python3.9[94957]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:01:50 compute-0 sudo[95107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cikmgtqdsvvgedkrfawlathdclyctbhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158909.8882155-1832-253342869975397/AnsiballZ_stat.py'
Jan 23 09:01:50 compute-0 sudo[95107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:50 compute-0 python3.9[95109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:01:50 compute-0 sudo[95107]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:50 compute-0 sudo[95230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nszfcgastaimeaymlmyhgzqgiwvolcom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158909.8882155-1832-253342869975397/AnsiballZ_copy.py'
Jan 23 09:01:50 compute-0 sudo[95230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:50 compute-0 python3.9[95232]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158909.8882155-1832-253342869975397/.source.yaml _original_basename=._bd8tyax follow=False checksum=cab9709c37f943d389561ff89ee5ad42859acc19 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:50 compute-0 sudo[95230]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:50 compute-0 sudo[95382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exsqdfwhnbssqxyqgrefkybohdwdrlwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158910.7566528-1877-188756458996486/AnsiballZ_command.py'
Jan 23 09:01:50 compute-0 sudo[95382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:51 compute-0 python3.9[95384]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:51 compute-0 ovs-vsctl[95385]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 09:01:51 compute-0 sudo[95382]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:51 compute-0 sudo[95535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wywljcqlrdkdiftvtvmribnwcmvedxwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158911.2314057-1901-29467127471184/AnsiballZ_command.py'
Jan 23 09:01:51 compute-0 sudo[95535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:51 compute-0 python3.9[95537]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:51 compute-0 ovs-vsctl[95539]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 09:01:51 compute-0 sudo[95535]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:52 compute-0 sudo[95690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyozkdhvyaxziucbrztkyzyijqgljams ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158912.021633-1943-215609422913355/AnsiballZ_command.py'
Jan 23 09:01:52 compute-0 sudo[95690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:52 compute-0 python3.9[95692]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:01:52 compute-0 ovs-vsctl[95693]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 09:01:52 compute-0 sudo[95690]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:52 compute-0 sshd-session[84239]: Connection closed by 192.168.122.30 port 34096
Jan 23 09:01:52 compute-0 sshd-session[84236]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:01:52 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 09:01:52 compute-0 systemd[1]: session-19.scope: Consumed 31.945s CPU time.
Jan 23 09:01:52 compute-0 systemd-logind[746]: Session 19 logged out. Waiting for processes to exit.
Jan 23 09:01:52 compute-0 systemd-logind[746]: Removed session 19.
Jan 23 09:01:57 compute-0 sshd-session[95718]: Accepted publickey for zuul from 192.168.122.30 port 49072 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:01:57 compute-0 systemd-logind[746]: New session 21 of user zuul.
Jan 23 09:01:57 compute-0 systemd[1]: Started Session 21 of User zuul.
Jan 23 09:01:57 compute-0 sshd-session[95718]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:58 compute-0 python3.9[95871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:58 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 23 09:01:58 compute-0 systemd[94726]: Activating special unit Exit the Session...
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped target Main User Target.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped target Basic System.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped target Paths.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped target Sockets.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped target Timers.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:01:58 compute-0 systemd[94726]: Closed D-Bus User Message Bus Socket.
Jan 23 09:01:58 compute-0 systemd[94726]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:01:58 compute-0 systemd[94726]: Removed slice User Application Slice.
Jan 23 09:01:58 compute-0 systemd[94726]: Reached target Shutdown.
Jan 23 09:01:58 compute-0 systemd[94726]: Finished Exit the Session.
Jan 23 09:01:58 compute-0 systemd[94726]: Reached target Exit the Session.
Jan 23 09:01:58 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 09:01:58 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 23 09:01:58 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 09:01:58 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 09:01:58 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 09:01:58 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 09:01:58 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 09:01:58 compute-0 sudo[96027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgwwnxcimbcklfdtrpyulfcaiutuwiec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158918.6104693-62-196781112757434/AnsiballZ_file.py'
Jan 23 09:01:58 compute-0 sudo[96027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:59 compute-0 python3.9[96029]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:59 compute-0 sudo[96027]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:59 compute-0 sudo[96180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkudvkotxirlxitxuwpbxichfybilabf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158919.1991878-62-198886352696131/AnsiballZ_file.py'
Jan 23 09:01:59 compute-0 sudo[96180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:59 compute-0 python3.9[96182]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:59 compute-0 sudo[96180]: pam_unix(sudo:session): session closed for user root
Jan 23 09:01:59 compute-0 sudo[96332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftpcibkkcpoppqhgzlvvtlzyazbocriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158919.647692-62-245159278777953/AnsiballZ_file.py'
Jan 23 09:01:59 compute-0 sudo[96332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:01:59 compute-0 python3.9[96334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:01:59 compute-0 sudo[96332]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:00 compute-0 sudo[96484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yebwvokjcgtnzjheloomhqrcknkztvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158920.0806947-62-44378875392465/AnsiballZ_file.py'
Jan 23 09:02:00 compute-0 sudo[96484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:00 compute-0 python3.9[96486]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:00 compute-0 sudo[96484]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:00 compute-0 sudo[96636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzjjoqtnegxppeslflfptdjfaqrjnzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158920.5066397-62-16487938559847/AnsiballZ_file.py'
Jan 23 09:02:00 compute-0 sudo[96636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:00 compute-0 python3.9[96638]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:00 compute-0 sudo[96636]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:01 compute-0 python3.9[96788]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:02:01 compute-0 sudo[96938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozczbsjehaqizqdserdvavxpmecwhtmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158921.5813484-194-254758003445923/AnsiballZ_seboolean.py'
Jan 23 09:02:01 compute-0 sudo[96938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:02 compute-0 python3.9[96940]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 09:02:02 compute-0 sudo[96938]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:03 compute-0 python3.9[97090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:03 compute-0 python3.9[97211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158922.6744816-218-245624053079731/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:04 compute-0 python3.9[97361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:04 compute-0 python3.9[97482]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158923.7434363-263-86082990870820/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:04 compute-0 sudo[97632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uagfhgrglvqqgrepoiroaykbkejgxkjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158924.751193-314-209048057997436/AnsiballZ_setup.py'
Jan 23 09:02:04 compute-0 sudo[97632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:05 compute-0 python3.9[97634]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:02:05 compute-0 sudo[97632]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:05 compute-0 sudo[97716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iygemrzhzgajebefqkjbteojeszxbfhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158924.751193-314-209048057997436/AnsiballZ_dnf.py'
Jan 23 09:02:05 compute-0 sudo[97716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:05 compute-0 python3.9[97718]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:02:06 compute-0 sudo[97716]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:07 compute-0 sudo[97870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avdshipsvcwagtnhxixvwrpcddumidtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158926.930695-350-217052356459933/AnsiballZ_systemd.py'
Jan 23 09:02:07 compute-0 sudo[97870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:07 compute-0 python3.9[97872]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:02:07 compute-0 sudo[97870]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:08 compute-0 python3.9[98025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:08 compute-0 python3.9[98146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158927.8232965-374-243889350851898/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:08 compute-0 python3.9[98296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:09 compute-0 python3.9[98417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158928.643873-374-280070254995482/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:10 compute-0 python3.9[98567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:10 compute-0 python3.9[98688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158929.915371-506-59927946011022/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:11 compute-0 python3.9[98838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:11 compute-0 python3.9[98959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158930.7033448-506-92248889106947/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:11 compute-0 python3.9[99109]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:02:12 compute-0 sudo[99261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mriqcqruzigleyhbikudqolmstjjmozh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158932.0706272-620-270031434166932/AnsiballZ_file.py'
Jan 23 09:02:12 compute-0 sudo[99261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:12 compute-0 python3.9[99263]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:12 compute-0 sudo[99261]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:12 compute-0 sudo[99413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exdufjqnxzomwskcimvrmznxoaoswibx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158932.5519884-644-271559075853379/AnsiballZ_stat.py'
Jan 23 09:02:12 compute-0 sudo[99413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:12 compute-0 python3.9[99415]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:12 compute-0 sudo[99413]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:13 compute-0 sudo[99491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgnaibonttxxsmzqhfjlowfamjgnilnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158932.5519884-644-271559075853379/AnsiballZ_file.py'
Jan 23 09:02:13 compute-0 sudo[99491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:13 compute-0 python3.9[99493]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:13 compute-0 sudo[99491]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:13 compute-0 sudo[99643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqtytzujjdbjvjthnixrouvnbvwbbhxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158933.3093-644-6174637131342/AnsiballZ_stat.py'
Jan 23 09:02:13 compute-0 sudo[99643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:13 compute-0 python3.9[99645]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:13 compute-0 sudo[99643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:13 compute-0 sudo[99721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sczkeczusnfkquxihaewzogbgwhcxmow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158933.3093-644-6174637131342/AnsiballZ_file.py'
Jan 23 09:02:13 compute-0 sudo[99721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:14 compute-0 python3.9[99723]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:14 compute-0 sudo[99721]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:14 compute-0 sudo[99873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltxhbkoyxgchrhjtwlhmbvrvvjllhtva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158934.1688874-713-60194024356073/AnsiballZ_file.py'
Jan 23 09:02:14 compute-0 sudo[99873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:14 compute-0 python3.9[99875]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:14 compute-0 sudo[99873]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:14 compute-0 sudo[100025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-touwgeseoxrnelfqpfxqsqgqntoxbtdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158934.674251-737-122462367073607/AnsiballZ_stat.py'
Jan 23 09:02:14 compute-0 sudo[100025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:15 compute-0 python3.9[100027]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:15 compute-0 sudo[100025]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:15 compute-0 sudo[100103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duurszkgsgbgaqjwdvvdlsdcyevqxnxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158934.674251-737-122462367073607/AnsiballZ_file.py'
Jan 23 09:02:15 compute-0 sudo[100103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:15 compute-0 python3.9[100105]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:15 compute-0 sudo[100103]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:15 compute-0 sudo[100255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huxzcbhmfetdfypsugujesgvnmrqyjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158935.542871-773-53755587816233/AnsiballZ_stat.py'
Jan 23 09:02:15 compute-0 sudo[100255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:15 compute-0 python3.9[100257]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:15 compute-0 sudo[100255]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:16 compute-0 sudo[100333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldctlwmzgpanlgtkszzivxeyptedosgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158935.542871-773-53755587816233/AnsiballZ_file.py'
Jan 23 09:02:16 compute-0 sudo[100333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:16 compute-0 python3.9[100335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:16 compute-0 sudo[100333]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:16 compute-0 sudo[100485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajwtndenseboxkqssitnejzadrslmpur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158936.4144871-809-168682430520696/AnsiballZ_systemd.py'
Jan 23 09:02:16 compute-0 sudo[100485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:16 compute-0 python3.9[100487]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:16 compute-0 systemd[1]: Reloading.
Jan 23 09:02:16 compute-0 systemd-rc-local-generator[100505]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:16 compute-0 systemd-sysv-generator[100510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:17 compute-0 sudo[100485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:17 compute-0 sudo[100674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atwfnjpgjqhiddbzosuuedvxmzszhhaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158937.2731023-833-152766215165321/AnsiballZ_stat.py'
Jan 23 09:02:17 compute-0 sudo[100674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:17 compute-0 python3.9[100676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:17 compute-0 sudo[100674]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:17 compute-0 sudo[100752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyesxvxkgcwjnogdbsrqdpjlkmmxxiky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158937.2731023-833-152766215165321/AnsiballZ_file.py'
Jan 23 09:02:17 compute-0 sudo[100752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:18 compute-0 python3.9[100754]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:18 compute-0 sudo[100752]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:18 compute-0 sudo[100904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agyfeberwdnavybvquyajhnhlyicyqnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158938.1723151-869-48669926884581/AnsiballZ_stat.py'
Jan 23 09:02:18 compute-0 sudo[100904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:18 compute-0 python3.9[100906]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:18 compute-0 sudo[100904]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:18 compute-0 sudo[100991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlrkibtjsfhaxrtjwnqefmjvnjosiqfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158938.1723151-869-48669926884581/AnsiballZ_file.py'
Jan 23 09:02:18 compute-0 sudo[100991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:18 compute-0 ovn_controller[94697]: 2026-01-23T09:02:18Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Jan 23 09:02:18 compute-0 ovn_controller[94697]: 2026-01-23T09:02:18Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 23 09:02:18 compute-0 podman[100956]: 2026-01-23 09:02:18.798103062 +0000 UTC m=+0.067407565 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 23 09:02:18 compute-0 python3.9[100998]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:18 compute-0 sudo[100991]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:19 compute-0 sudo[101157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlopmhorjiwhrukuhqeelynexwwpjnzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158939.0554545-905-190028626685808/AnsiballZ_systemd.py'
Jan 23 09:02:19 compute-0 sudo[101157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:19 compute-0 python3.9[101159]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:19 compute-0 systemd[1]: Reloading.
Jan 23 09:02:19 compute-0 systemd-rc-local-generator[101184]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:19 compute-0 systemd-sysv-generator[101188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:19 compute-0 systemd[1]: Starting Create netns directory...
Jan 23 09:02:19 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:02:19 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:02:19 compute-0 systemd[1]: Finished Create netns directory.
Jan 23 09:02:19 compute-0 sudo[101157]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:20 compute-0 sudo[101350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqudlcwppallqhcnsdhdsubyxusmuzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158939.9795814-935-135606784250129/AnsiballZ_file.py'
Jan 23 09:02:20 compute-0 sudo[101350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:20 compute-0 python3.9[101352]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:20 compute-0 sudo[101350]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:20 compute-0 sudo[101502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cflkgbskhyyotchzzzijcartznvehbpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158940.4679236-959-167205938991174/AnsiballZ_stat.py'
Jan 23 09:02:20 compute-0 sudo[101502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:20 compute-0 python3.9[101504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:20 compute-0 sudo[101502]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:21 compute-0 sudo[101625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qotufhsrfpgfmhdzpjyygcaallxlosip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158940.4679236-959-167205938991174/AnsiballZ_copy.py'
Jan 23 09:02:21 compute-0 sudo[101625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:21 compute-0 python3.9[101627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158940.4679236-959-167205938991174/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:21 compute-0 sudo[101625]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:21 compute-0 sudo[101777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfjemnipmqccyhlgiheedysydzthekrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158941.5632415-1010-263909000932849/AnsiballZ_file.py'
Jan 23 09:02:21 compute-0 sudo[101777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:21 compute-0 python3.9[101779]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:21 compute-0 sudo[101777]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 compute-0 sudo[101929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkhigstjdskgshivtkdlisuulodwttjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158942.0693202-1034-60077090601254/AnsiballZ_file.py'
Jan 23 09:02:22 compute-0 sudo[101929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:22 compute-0 python3.9[101931]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:02:22 compute-0 sudo[101929]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 compute-0 sudo[102081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqqqlwhrqhnwggkywdttvrxzbkmlzuzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158942.551074-1058-150309640255065/AnsiballZ_stat.py'
Jan 23 09:02:22 compute-0 sudo[102081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:22 compute-0 python3.9[102083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:22 compute-0 sudo[102081]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:23 compute-0 sudo[102204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwgplelkftqvazdbeykqqaswzuljvvlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158942.551074-1058-150309640255065/AnsiballZ_copy.py'
Jan 23 09:02:23 compute-0 sudo[102204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:23 compute-0 python3.9[102206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158942.551074-1058-150309640255065/.source.json _original_basename=.4wu1trbo follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:23 compute-0 sudo[102204]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:23 compute-0 python3.9[102356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:25 compute-0 sudo[102777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swdibpsthxzpwauejftauuusmayzqney ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158945.0823624-1178-187946767469236/AnsiballZ_container_config_data.py'
Jan 23 09:02:25 compute-0 sudo[102777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:25 compute-0 python3.9[102779]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 09:02:25 compute-0 sudo[102777]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:26 compute-0 sudo[102929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphaflmkdammnkzhlmuikcwknmpsbiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158945.8848808-1211-181389825445893/AnsiballZ_container_config_hash.py'
Jan 23 09:02:26 compute-0 sudo[102929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:26 compute-0 python3.9[102931]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:02:26 compute-0 sudo[102929]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:27 compute-0 sudo[103081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcjrghjvzoeubcalpcrizlnbyqobxokp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769158946.7225053-1241-187916139993948/AnsiballZ_edpm_container_manage.py'
Jan 23 09:02:27 compute-0 sudo[103081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:27 compute-0 python3[103083]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:02:34 compute-0 podman[103093]: 2026-01-23 09:02:34.382515092 +0000 UTC m=+6.952414063 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:02:34 compute-0 podman[103174]: 2026-01-23 09:02:34.482317502 +0000 UTC m=+0.029339703 container create a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:02:34 compute-0 podman[103174]: 2026-01-23 09:02:34.468547296 +0000 UTC m=+0.015569517 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:02:34 compute-0 python3[103083]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:02:34 compute-0 sudo[103081]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:35 compute-0 sudo[103350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdadpzomhegueilkzjwunrpaotaftzju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158955.2180114-1265-94570315335292/AnsiballZ_stat.py'
Jan 23 09:02:35 compute-0 sudo[103350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:35 compute-0 python3.9[103352]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:02:35 compute-0 sudo[103350]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:35 compute-0 sudo[103504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inytikazzqacytzcnustopyajyiczsbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158955.7698824-1292-279621328085238/AnsiballZ_file.py'
Jan 23 09:02:35 compute-0 sudo[103504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:36 compute-0 python3.9[103506]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:36 compute-0 sudo[103504]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:36 compute-0 sudo[103580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jatscgywmxvbpmickawiukngtxtwtvnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158955.7698824-1292-279621328085238/AnsiballZ_stat.py'
Jan 23 09:02:36 compute-0 sudo[103580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:36 compute-0 python3.9[103582]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:02:36 compute-0 sudo[103580]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:36 compute-0 sudo[103731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwddbpavkddhgwedogqxelcwogaqrfcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158956.455043-1292-18469185662799/AnsiballZ_copy.py'
Jan 23 09:02:36 compute-0 sudo[103731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:36 compute-0 python3.9[103733]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158956.455043-1292-18469185662799/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:36 compute-0 sudo[103731]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:37 compute-0 sudo[103807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctmcjtlxlafviixcyshgtifgoiqqmdkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158956.455043-1292-18469185662799/AnsiballZ_systemd.py'
Jan 23 09:02:37 compute-0 sudo[103807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:37 compute-0 python3.9[103809]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:02:37 compute-0 systemd[1]: Reloading.
Jan 23 09:02:37 compute-0 systemd-rc-local-generator[103834]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:37 compute-0 systemd-sysv-generator[103839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:37 compute-0 sudo[103807]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:37 compute-0 sudo[103918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdyrkpqqivogwafsscbzqzjjmemqwdsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158956.455043-1292-18469185662799/AnsiballZ_systemd.py'
Jan 23 09:02:37 compute-0 sudo[103918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:37 compute-0 python3.9[103920]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:37 compute-0 systemd[1]: Reloading.
Jan 23 09:02:38 compute-0 systemd-sysv-generator[103946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:38 compute-0 systemd-rc-local-generator[103943]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:38 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 09:02:38 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:02:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce02413a9f657d4551d69bbbdbb6ede025d3c16b70c59cb08c39fc2a92c30b4b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 09:02:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce02413a9f657d4551d69bbbdbb6ede025d3c16b70c59cb08c39fc2a92c30b4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:02:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d.
Jan 23 09:02:38 compute-0 podman[103961]: 2026-01-23 09:02:38.225868683 +0000 UTC m=+0.078702875 container init a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + sudo -E kolla_set_configs
Jan 23 09:02:38 compute-0 podman[103961]: 2026-01-23 09:02:38.246914148 +0000 UTC m=+0.099748319 container start a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 09:02:38 compute-0 edpm-start-podman-container[103961]: ovn_metadata_agent
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Validating config file
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Copying service configuration files
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Writing out command to execute
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 09:02:38 compute-0 edpm-start-podman-container[103960]: Creating additional drop-in dependency for "ovn_metadata_agent" (a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d)
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: ++ cat /run_command
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + CMD=neutron-ovn-metadata-agent
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + ARGS=
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + sudo kolla_copy_cacerts
Jan 23 09:02:38 compute-0 systemd[1]: Reloading.
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + [[ ! -n '' ]]
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + . kolla_extend_start
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + umask 0022
Jan 23 09:02:38 compute-0 ovn_metadata_agent[103973]: + exec neutron-ovn-metadata-agent
Jan 23 09:02:38 compute-0 podman[103980]: 2026-01-23 09:02:38.322164295 +0000 UTC m=+0.067215989 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:02:38 compute-0 systemd-rc-local-generator[104037]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:38 compute-0 systemd-sysv-generator[104041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:38 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 23 09:02:38 compute-0 sudo[103918]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:39 compute-0 python3.9[104205]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.799 103978 INFO neutron.common.config [-] Logging enabled!
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.799 103978 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.799 103978 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.800 103978 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.801 103978 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.802 103978 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.803 103978 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.804 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.805 103978 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.806 103978 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.807 103978 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.808 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.809 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.810 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.811 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.812 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.813 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.814 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.815 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.816 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.817 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.818 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.819 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.820 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.821 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.822 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.823 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.824 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.825 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.826 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.827 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.828 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.829 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.830 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.831 103978 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.838 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.838 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.839 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.839 103978 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.839 103978 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 23 09:02:39 compute-0 sudo[104355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxvedwjxbkaargxxdpnfctjslqlbwxqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158959.6768336-1427-28505162413288/AnsiballZ_stat.py'
Jan 23 09:02:39 compute-0 sudo[104355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.849 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 5bdcc3dc-0ac8-4139-a9e7-75947c17f20e (UUID: 5bdcc3dc-0ac8-4139-a9e7-75947c17f20e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.865 103978 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.865 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.865 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.865 103978 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.868 103978 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.873 103978 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.876 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '5bdcc3dc-0ac8-4139-a9e7-75947c17f20e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], external_ids={}, name=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, nb_cfg_timestamp=1769158916760, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.877 103978 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f363770eb50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.878 103978 INFO oslo_service.service [-] Starting 1 workers
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.881 103978 DEBUG oslo_service.service [-] Started child 104358 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.884 103978 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp33ak9y42/privsep.sock']
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.884 104358 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-947613'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.902 104358 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.903 104358 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.903 104358 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.905 104358 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.909 104358 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 09:02:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:39.913 104358 INFO eventlet.wsgi.server [-] (104358) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 23 09:02:39 compute-0 python3.9[104357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:02:40 compute-0 sudo[104355]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:40 compute-0 sudo[104485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstorvjkdwirdenntedbdjqpvnxgcsfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158959.6768336-1427-28505162413288/AnsiballZ_copy.py'
Jan 23 09:02:40 compute-0 sudo[104485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:40 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.421 103978 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.422 103978 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp33ak9y42/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.335 104488 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.338 104488 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.340 104488 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.340 104488 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104488
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.424 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a98228cc-3175-4060-af56-486c7f9dafdf]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:02:40 compute-0 python3.9[104487]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158959.6768336-1427-28505162413288/.source.yaml _original_basename=.89ai01ao follow=False checksum=ab31ce83297dc6526d14252dba018f686c850eca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:40 compute-0 sudo[104485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:40 compute-0 sshd-session[95721]: Connection closed by 192.168.122.30 port 49072
Jan 23 09:02:40 compute-0 sshd-session[95718]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:02:40 compute-0 systemd-logind[746]: Session 21 logged out. Waiting for processes to exit.
Jan 23 09:02:40 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Jan 23 09:02:40 compute-0 systemd[1]: session-21.scope: Consumed 37.984s CPU time.
Jan 23 09:02:40 compute-0 systemd-logind[746]: Removed session 21.
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.829 104488 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.829 104488 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:02:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:40.829 104488 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.259 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[cc65f235-964c-4eba-9c4b-3d5ed687b296]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.260 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, column=external_ids, values=({'neutron:ovn-metadata-id': '1a95796e-17ae-59ae-b067-53c4ab964d99'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.265 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.270 103978 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.271 103978 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.272 103978 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.273 103978 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.274 103978 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.275 103978 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.276 103978 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.277 103978 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.278 103978 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.279 103978 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.280 103978 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.281 103978 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.282 103978 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.283 103978 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.284 103978 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.285 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.286 103978 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.287 103978 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.288 103978 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.289 103978 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.290 103978 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.291 103978 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.292 103978 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.293 103978 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.294 103978 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.295 103978 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.296 103978 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.296 103978 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.296 103978 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.297 103978 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.298 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.299 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.300 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.301 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:02:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:02:41.302 103978 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:02:45 compute-0 sshd-session[104517]: Accepted publickey for zuul from 192.168.122.30 port 47766 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:02:45 compute-0 systemd-logind[746]: New session 22 of user zuul.
Jan 23 09:02:45 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 23 09:02:45 compute-0 sshd-session[104517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:02:46 compute-0 python3.9[104670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:02:47 compute-0 sudo[104824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xahfmhlsidxuwfzenxzjegerwvmytzab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158966.9115937-62-59703580163574/AnsiballZ_command.py'
Jan 23 09:02:47 compute-0 sudo[104824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:47 compute-0 python3.9[104826]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:47 compute-0 sudo[104824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:48 compute-0 sudo[104985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reufmzvojgtqpglrfngkqsrsctoexhme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158967.723048-95-268388053757468/AnsiballZ_systemd_service.py'
Jan 23 09:02:48 compute-0 sudo[104985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:48 compute-0 python3.9[104987]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:02:48 compute-0 systemd[1]: Reloading.
Jan 23 09:02:48 compute-0 systemd-sysv-generator[105013]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:02:48 compute-0 systemd-rc-local-generator[105008]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:02:48 compute-0 sudo[104985]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:49 compute-0 podman[105146]: 2026-01-23 09:02:49.064376468 +0000 UTC m=+0.059431773 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:02:49 compute-0 python3.9[105184]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:02:49 compute-0 network[105212]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:02:49 compute-0 network[105213]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:02:49 compute-0 network[105214]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:02:51 compute-0 sudo[105473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejnwcxhlkdobcsepwviiyxupjlplacye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158971.1961567-152-234464372396357/AnsiballZ_systemd_service.py'
Jan 23 09:02:51 compute-0 sudo[105473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:51 compute-0 python3.9[105475]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:51 compute-0 sudo[105473]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:51 compute-0 sudo[105626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wexwfpfqszxjsqkbactnaxxghffoquxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158971.7569568-152-35863577067341/AnsiballZ_systemd_service.py'
Jan 23 09:02:51 compute-0 sudo[105626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:52 compute-0 python3.9[105628]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:52 compute-0 sudo[105626]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:52 compute-0 sudo[105779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxmrqodzfqlvtxslxmiuaccdluenbucr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158972.306289-152-264388916984065/AnsiballZ_systemd_service.py'
Jan 23 09:02:52 compute-0 sudo[105779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:53 compute-0 python3.9[105781]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:53 compute-0 sudo[105779]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:53 compute-0 sudo[105932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrujmwdwxeugwbwcftcdoojurgyttvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158973.718342-152-230408288984999/AnsiballZ_systemd_service.py'
Jan 23 09:02:53 compute-0 sudo[105932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:54 compute-0 python3.9[105934]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:54 compute-0 sudo[105932]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:54 compute-0 sudo[106085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uztynfqhyndvaqpygbvuldoyetgdhrrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158974.263901-152-112997719355778/AnsiballZ_systemd_service.py'
Jan 23 09:02:54 compute-0 sudo[106085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:54 compute-0 python3.9[106087]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:54 compute-0 sudo[106085]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:55 compute-0 sudo[106238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfhqbikppsnjdfbomozibeikqxbiriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158974.8157299-152-149647078327862/AnsiballZ_systemd_service.py'
Jan 23 09:02:55 compute-0 sudo[106238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:55 compute-0 python3.9[106240]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:55 compute-0 sudo[106238]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:55 compute-0 sudo[106391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgingtzptlibfebvtibdfekjjawoywms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158975.3765078-152-92095223202120/AnsiballZ_systemd_service.py'
Jan 23 09:02:55 compute-0 sudo[106391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:55 compute-0 python3.9[106393]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:02:55 compute-0 sudo[106391]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:56 compute-0 sudo[106544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzmputvhvzzkyxpxbmhbrrlpgxwjuvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158976.2091584-308-12017402153374/AnsiballZ_file.py'
Jan 23 09:02:56 compute-0 sudo[106544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:56 compute-0 python3.9[106546]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:56 compute-0 sudo[106544]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:56 compute-0 sudo[106696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucdbvfpkxrxdtmjchequxcpnminbrbpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158976.7901726-308-181369654445787/AnsiballZ_file.py'
Jan 23 09:02:56 compute-0 sudo[106696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:57 compute-0 python3.9[106698]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:57 compute-0 sudo[106696]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:57 compute-0 sudo[106848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvovltiphmygwcmcstyrnpgadhmkyqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158977.2251246-308-134980485404265/AnsiballZ_file.py'
Jan 23 09:02:57 compute-0 sudo[106848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:57 compute-0 python3.9[106850]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:57 compute-0 sudo[106848]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:57 compute-0 sudo[107000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxjphqfgthyvuvglnxuywqzltmzyzex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158977.6512692-308-147662096044659/AnsiballZ_file.py'
Jan 23 09:02:57 compute-0 sudo[107000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:57 compute-0 python3.9[107002]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:57 compute-0 sudo[107000]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:58 compute-0 sudo[107152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caqzynyeuvpmpixjdtnikcljrcywmrlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158978.0798645-308-178298754557641/AnsiballZ_file.py'
Jan 23 09:02:58 compute-0 sudo[107152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:58 compute-0 python3.9[107154]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:58 compute-0 sudo[107152]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:58 compute-0 sudo[107304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scsvnoibapmovpgwtxnwwrvaajxinose ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158978.5171103-308-102613426928191/AnsiballZ_file.py'
Jan 23 09:02:58 compute-0 sudo[107304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:58 compute-0 python3.9[107306]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:58 compute-0 sudo[107304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:59 compute-0 sudo[107456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohxohinjnqszjgkpiluduxnehopifkaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158978.927134-308-32131307719041/AnsiballZ_file.py'
Jan 23 09:02:59 compute-0 sudo[107456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:59 compute-0 python3.9[107458]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:59 compute-0 sudo[107456]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:00 compute-0 sudo[107608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apgcoxhknbitmddyyiisrzzidqfjfkkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158980.5402694-458-78709577058578/AnsiballZ_file.py'
Jan 23 09:03:00 compute-0 sudo[107608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:00 compute-0 python3.9[107610]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:00 compute-0 sudo[107608]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:01 compute-0 sudo[107760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxuyxovfmszyxdrpulcthwibououcrhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158980.9559224-458-199828137612751/AnsiballZ_file.py'
Jan 23 09:03:01 compute-0 sudo[107760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:01 compute-0 python3.9[107762]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:01 compute-0 sudo[107760]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:01 compute-0 sudo[107912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukjxlenajldnhuqfhrhiypmfbwxsqqqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158981.4645214-458-145312593109030/AnsiballZ_file.py'
Jan 23 09:03:01 compute-0 sudo[107912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:01 compute-0 python3.9[107914]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:01 compute-0 sudo[107912]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:02 compute-0 sudo[108064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqcroylenwqxwfubkpuwczeatjdaoqrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158981.8853486-458-172090247010057/AnsiballZ_file.py'
Jan 23 09:03:02 compute-0 sudo[108064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:02 compute-0 python3.9[108066]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:02 compute-0 sudo[108064]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:02 compute-0 sudo[108216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhuduyvlojfjtskyzbnvsxoylynjaexb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158982.2915125-458-60950062221850/AnsiballZ_file.py'
Jan 23 09:03:02 compute-0 sudo[108216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:02 compute-0 python3.9[108218]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:02 compute-0 sudo[108216]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:02 compute-0 sudo[108368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjltkrepgqdewwnymyszsfqclbfmcmmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158982.7083993-458-116556431427851/AnsiballZ_file.py'
Jan 23 09:03:02 compute-0 sudo[108368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:03 compute-0 python3.9[108370]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:03 compute-0 sudo[108368]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:03 compute-0 sudo[108520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfretvucjxlgnvnttwvnakfrmhksravq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158983.12833-458-273170712538877/AnsiballZ_file.py'
Jan 23 09:03:03 compute-0 sudo[108520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:03 compute-0 python3.9[108522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:03:03 compute-0 sudo[108520]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:04 compute-0 sudo[108672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijyhftbpetxlfcujwsjxkraijsercwva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158983.9439917-611-212408660736417/AnsiballZ_command.py'
Jan 23 09:03:04 compute-0 sudo[108672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:04 compute-0 python3.9[108674]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:04 compute-0 sudo[108672]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:04 compute-0 python3.9[108826]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:03:05 compute-0 sudo[108976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvkdppobeezqiqexzhdhqqjfnmzdlaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158985.1428745-665-211083972578523/AnsiballZ_systemd_service.py'
Jan 23 09:03:05 compute-0 sudo[108976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:05 compute-0 python3.9[108978]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:03:05 compute-0 systemd[1]: Reloading.
Jan 23 09:03:05 compute-0 systemd-sysv-generator[109002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:03:05 compute-0 systemd-rc-local-generator[108998]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:03:05 compute-0 sudo[108976]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:06 compute-0 sudo[109163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaxweicwdxnklfrdzdjqxxrvfizxffdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158985.9127586-689-195784391299225/AnsiballZ_command.py'
Jan 23 09:03:06 compute-0 sudo[109163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:06 compute-0 python3.9[109165]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:06 compute-0 sudo[109163]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:06 compute-0 sudo[109316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfcndvcegthigildvkgietdvxvqajhfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158986.3787122-689-152207711243606/AnsiballZ_command.py'
Jan 23 09:03:06 compute-0 sudo[109316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:06 compute-0 python3.9[109318]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:06 compute-0 sudo[109316]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:07 compute-0 sudo[109469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgchnwsgxexqivgqrsdoefvwdydjycdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158986.8317604-689-117039710813374/AnsiballZ_command.py'
Jan 23 09:03:07 compute-0 sudo[109469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:07 compute-0 python3.9[109471]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:07 compute-0 sudo[109469]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:07 compute-0 sudo[109622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qumkwuvuopywpyvermdxhpzqcwwxasbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158987.2783616-689-89217919987762/AnsiballZ_command.py'
Jan 23 09:03:07 compute-0 sudo[109622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:07 compute-0 python3.9[109624]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:07 compute-0 sudo[109622]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:07 compute-0 sudo[109775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gimkuygxdvbkukceecptzsyozhasdtcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158987.7312481-689-60919277834707/AnsiballZ_command.py'
Jan 23 09:03:07 compute-0 sudo[109775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:08 compute-0 python3.9[109777]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:08 compute-0 sudo[109775]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:08 compute-0 sudo[109928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbflxhxmclnpthpwbymijweqhwvqhsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158988.3255262-689-204485456773617/AnsiballZ_command.py'
Jan 23 09:03:08 compute-0 sudo[109928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:08 compute-0 python3.9[109930]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:08 compute-0 sudo[109928]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:08 compute-0 podman[109932]: 2026-01-23 09:03:08.717337004 +0000 UTC m=+0.041993897 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 09:03:08 compute-0 sudo[110097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eedugxhbysmpagjataawqciahjdpbvin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158988.7772274-689-28266819899910/AnsiballZ_command.py'
Jan 23 09:03:08 compute-0 sudo[110097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:09 compute-0 python3.9[110099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:03:09 compute-0 sudo[110097]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:09 compute-0 sudo[110250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfuwdpliikbanfwovpnsjfhplacizxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158989.5727117-851-235608885019190/AnsiballZ_getent.py'
Jan 23 09:03:09 compute-0 sudo[110250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:10 compute-0 python3.9[110252]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 09:03:10 compute-0 sudo[110250]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:10 compute-0 sudo[110403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkivzwehuergbdyidakinhuqsvzgoill ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158990.1621292-875-123555822104191/AnsiballZ_group.py'
Jan 23 09:03:10 compute-0 sudo[110403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:10 compute-0 python3.9[110405]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:03:10 compute-0 groupadd[110406]: group added to /etc/group: name=libvirt, GID=42473
Jan 23 09:03:10 compute-0 groupadd[110406]: group added to /etc/gshadow: name=libvirt
Jan 23 09:03:10 compute-0 groupadd[110406]: new group: name=libvirt, GID=42473
Jan 23 09:03:10 compute-0 sudo[110403]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:11 compute-0 sudo[110561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmcwdstmlmgsxndrngkqsprhggpktesz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158990.7833967-899-47310008945582/AnsiballZ_user.py'
Jan 23 09:03:11 compute-0 sudo[110561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:11 compute-0 python3.9[110563]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:03:11 compute-0 useradd[110565]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:03:11 compute-0 sudo[110561]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:11 compute-0 sudo[110721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvtdqorpsgvxvcqfnhqykciqiikzqxkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158991.7463477-932-263565010595271/AnsiballZ_setup.py'
Jan 23 09:03:11 compute-0 sudo[110721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:12 compute-0 python3.9[110723]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:03:12 compute-0 sudo[110721]: pam_unix(sudo:session): session closed for user root
Jan 23 09:03:12 compute-0 sudo[110805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpboldgwohjrzpyxjokbccsnsuyialey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769158991.7463477-932-263565010595271/AnsiballZ_dnf.py'
Jan 23 09:03:12 compute-0 sudo[110805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:03:12 compute-0 python3.9[110807]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:03:19 compute-0 podman[110817]: 2026-01-23 09:03:19.22075158 +0000 UTC m=+0.058223146 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:03:39 compute-0 podman[111022]: 2026-01-23 09:03:39.201169555 +0000 UTC m=+0.038223462 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:03:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:03:39.841 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:03:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:03:39.841 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:03:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:03:39.841 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:03:40 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:03:40 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:03:48 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:03:50 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 09:03:50 compute-0 podman[111054]: 2026-01-23 09:03:50.227206223 +0000 UTC m=+0.060118975 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, container_name=ovn_controller)
Jan 23 09:04:10 compute-0 podman[123695]: 2026-01-23 09:04:10.19815205 +0000 UTC m=+0.035946589 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 09:04:21 compute-0 podman[127961]: 2026-01-23 09:04:21.216031813 +0000 UTC m=+0.054584403 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 09:04:23 compute-0 kernel: SELinux:  Converting 2765 SID table entries...
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:04:23 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:04:24 compute-0 groupadd[127997]: group added to /etc/group: name=dnsmasq, GID=993
Jan 23 09:04:24 compute-0 groupadd[127997]: group added to /etc/gshadow: name=dnsmasq
Jan 23 09:04:24 compute-0 groupadd[127997]: new group: name=dnsmasq, GID=993
Jan 23 09:04:24 compute-0 useradd[128004]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 23 09:04:24 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 09:04:24 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 09:04:24 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Jan 23 09:04:25 compute-0 groupadd[128017]: group added to /etc/group: name=clevis, GID=992
Jan 23 09:04:25 compute-0 groupadd[128017]: group added to /etc/gshadow: name=clevis
Jan 23 09:04:25 compute-0 groupadd[128017]: new group: name=clevis, GID=992
Jan 23 09:04:25 compute-0 useradd[128024]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 23 09:04:25 compute-0 usermod[128034]: add 'clevis' to group 'tss'
Jan 23 09:04:25 compute-0 usermod[128034]: add 'clevis' to shadow group 'tss'
Jan 23 09:04:26 compute-0 polkitd[43484]: Reloading rules
Jan 23 09:04:26 compute-0 polkitd[43484]: Collecting garbage unconditionally...
Jan 23 09:04:26 compute-0 polkitd[43484]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 09:04:26 compute-0 polkitd[43484]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 09:04:26 compute-0 polkitd[43484]: Finished loading, compiling and executing 3 rules
Jan 23 09:04:26 compute-0 polkitd[43484]: Reloading rules
Jan 23 09:04:26 compute-0 polkitd[43484]: Collecting garbage unconditionally...
Jan 23 09:04:26 compute-0 polkitd[43484]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 09:04:26 compute-0 polkitd[43484]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 09:04:26 compute-0 polkitd[43484]: Finished loading, compiling and executing 3 rules
Jan 23 09:04:27 compute-0 groupadd[128224]: group added to /etc/group: name=ceph, GID=167
Jan 23 09:04:27 compute-0 groupadd[128224]: group added to /etc/gshadow: name=ceph
Jan 23 09:04:27 compute-0 groupadd[128224]: new group: name=ceph, GID=167
Jan 23 09:04:27 compute-0 useradd[128230]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 23 09:04:29 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 09:04:29 compute-0 sshd[963]: Received signal 15; terminating.
Jan 23 09:04:29 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 09:04:29 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 09:04:29 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 09:04:29 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 23 09:04:29 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:04:29 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:04:29 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:04:29 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 23 09:04:29 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 23 09:04:29 compute-0 sshd[128749]: Server listening on 0.0.0.0 port 22.
Jan 23 09:04:29 compute-0 sshd[128749]: Server listening on :: port 22.
Jan 23 09:04:29 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 23 09:04:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:04:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:04:30 compute-0 systemd[1]: Reloading.
Jan 23 09:04:30 compute-0 systemd-rc-local-generator[129001]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:30 compute-0 systemd-sysv-generator[129007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:04:32 compute-0 sudo[110805]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:33 compute-0 sudo[133273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tridfbfuplrwwtoepcndwiqvfwaakbjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159072.800703-968-208783163716082/AnsiballZ_systemd.py'
Jan 23 09:04:33 compute-0 sudo[133273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:33 compute-0 python3.9[133295]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:04:33 compute-0 systemd[1]: Reloading.
Jan 23 09:04:33 compute-0 systemd-rc-local-generator[133795]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:33 compute-0 systemd-sysv-generator[133798]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:33 compute-0 sudo[133273]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:34 compute-0 sudo[134552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtnqzfplhwwiethdxhlkrnvmefmxdvzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159073.8263838-968-214792650727607/AnsiballZ_systemd.py'
Jan 23 09:04:34 compute-0 sudo[134552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:34 compute-0 python3.9[134576]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:04:34 compute-0 systemd[1]: Reloading.
Jan 23 09:04:34 compute-0 systemd-sysv-generator[135080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:34 compute-0 systemd-rc-local-generator[135077]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:34 compute-0 sudo[134552]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:34 compute-0 sudo[135862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvzeeepstleqjerfhgfgjbicardlhgns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159074.6275947-968-125670689925014/AnsiballZ_systemd.py'
Jan 23 09:04:34 compute-0 sudo[135862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:35 compute-0 python3.9[135885]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:04:35 compute-0 systemd[1]: Reloading.
Jan 23 09:04:35 compute-0 systemd-rc-local-generator[136425]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:35 compute-0 systemd-sysv-generator[136429]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:35 compute-0 sudo[135862]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:35 compute-0 sudo[137240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohylrdnacrzdomfphrtradthoavtfnzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159075.4422622-968-8648312183993/AnsiballZ_systemd.py'
Jan 23 09:04:35 compute-0 sudo[137240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:35 compute-0 python3.9[137262]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:04:35 compute-0 systemd[1]: Reloading.
Jan 23 09:04:35 compute-0 systemd-rc-local-generator[137733]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:35 compute-0 systemd-sysv-generator[137736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:36 compute-0 sudo[137240]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:36 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:04:36 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:04:36 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.917s CPU time.
Jan 23 09:04:36 compute-0 systemd[1]: run-r384012675c2c450e90011a41f9e34585.service: Deactivated successfully.
Jan 23 09:04:36 compute-0 sudo[138296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvlmhqoufkxkcmzxmwrrqacwcvaboapj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159076.2520835-1055-167233128133079/AnsiballZ_systemd.py'
Jan 23 09:04:36 compute-0 sudo[138296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:36 compute-0 python3.9[138298]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:36 compute-0 systemd[1]: Reloading.
Jan 23 09:04:36 compute-0 systemd-rc-local-generator[138322]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:36 compute-0 systemd-sysv-generator[138325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:36 compute-0 sudo[138296]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:37 compute-0 sudo[138485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayajbwwcxlkomwmrrawbugslrwpakepg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159076.981484-1055-77604675835625/AnsiballZ_systemd.py'
Jan 23 09:04:37 compute-0 sudo[138485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:37 compute-0 python3.9[138487]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:37 compute-0 systemd[1]: Reloading.
Jan 23 09:04:37 compute-0 systemd-rc-local-generator[138511]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:37 compute-0 systemd-sysv-generator[138516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:37 compute-0 sudo[138485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:37 compute-0 sudo[138675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqhpkmmgdfadymuepjxrgnaqknczbpkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159077.7389443-1055-194602604209549/AnsiballZ_systemd.py'
Jan 23 09:04:37 compute-0 sudo[138675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:38 compute-0 python3.9[138677]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:38 compute-0 systemd[1]: Reloading.
Jan 23 09:04:38 compute-0 systemd-rc-local-generator[138701]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:38 compute-0 systemd-sysv-generator[138704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:38 compute-0 sudo[138675]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:38 compute-0 sudo[138864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqxgbxlegkbyrrkxgeljnwompuiiswjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159078.4823415-1055-152675693133017/AnsiballZ_systemd.py'
Jan 23 09:04:38 compute-0 sudo[138864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:38 compute-0 python3.9[138866]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:38 compute-0 sudo[138864]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:39 compute-0 sudo[139019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppacapnnsgdxjgarglsotlilwuilmemg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159079.04087-1055-5824730204551/AnsiballZ_systemd.py'
Jan 23 09:04:39 compute-0 sudo[139019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:39 compute-0 python3.9[139021]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:39 compute-0 systemd[1]: Reloading.
Jan 23 09:04:39 compute-0 systemd-rc-local-generator[139045]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:39 compute-0 systemd-sysv-generator[139048]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:39 compute-0 sudo[139019]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:04:39.842 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:04:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:04:39.843 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:04:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:04:39.843 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:04:40 compute-0 sudo[139209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqxjqvundfthiodjvyceqxajghlrjmgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159079.9284227-1163-85258347800190/AnsiballZ_systemd.py'
Jan 23 09:04:40 compute-0 sudo[139209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:40 compute-0 python3.9[139211]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:04:40 compute-0 systemd[1]: Reloading.
Jan 23 09:04:40 compute-0 podman[139213]: 2026-01-23 09:04:40.419901312 +0000 UTC m=+0.044367414 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 09:04:40 compute-0 systemd-rc-local-generator[139251]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:04:40 compute-0 systemd-sysv-generator[139254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:04:40 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 09:04:40 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 09:04:40 compute-0 sudo[139209]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:40 compute-0 sudo[139418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjrgyfkpojfuqbttoxoocelakrnkasqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159080.7977645-1187-93466521774751/AnsiballZ_systemd.py'
Jan 23 09:04:40 compute-0 sudo[139418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:41 compute-0 python3.9[139420]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:42 compute-0 sudo[139418]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:42 compute-0 sudo[139573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edvwbeuryhogqkeqghcjiomocdteanof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159082.4394794-1187-68553762679848/AnsiballZ_systemd.py'
Jan 23 09:04:42 compute-0 sudo[139573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:42 compute-0 python3.9[139575]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:42 compute-0 sudo[139573]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:43 compute-0 sudo[139728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yugzaopvzmyluacvrsjchjyntatlrbfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159083.2298038-1187-191949352861669/AnsiballZ_systemd.py'
Jan 23 09:04:43 compute-0 sudo[139728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:43 compute-0 python3.9[139730]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:43 compute-0 sudo[139728]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:44 compute-0 sudo[139883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulunrzwvqwhkuemzfjebfhgcqduajqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159083.8624222-1187-259285729708249/AnsiballZ_systemd.py'
Jan 23 09:04:44 compute-0 sudo[139883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:44 compute-0 python3.9[139885]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:44 compute-0 sudo[139883]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:44 compute-0 sudo[140038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxoqsbzzdshomwcbykumayvufrwdsfcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159084.5004342-1187-252368866225467/AnsiballZ_systemd.py'
Jan 23 09:04:44 compute-0 sudo[140038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:44 compute-0 python3.9[140040]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:45 compute-0 sudo[140038]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:45 compute-0 sudo[140193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsgzpabidrxmxnroagsopnzaocacrpsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159085.1564367-1187-150920640543220/AnsiballZ_systemd.py'
Jan 23 09:04:45 compute-0 sudo[140193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:45 compute-0 python3.9[140195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:45 compute-0 sudo[140193]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:45 compute-0 sudo[140348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvfqywfazhiljigpwgwmnlrhvwknsyxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159085.7925143-1187-245814090424561/AnsiballZ_systemd.py'
Jan 23 09:04:45 compute-0 sudo[140348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:46 compute-0 python3.9[140350]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:46 compute-0 sudo[140348]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:46 compute-0 sudo[140503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zixslbdqlneuhjktpxeaqppfpncbbrjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159086.3668299-1187-185757350122791/AnsiballZ_systemd.py'
Jan 23 09:04:46 compute-0 sudo[140503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:46 compute-0 python3.9[140505]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:46 compute-0 sudo[140503]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:47 compute-0 sudo[140658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlajngxldryrowfjwkdehcdyomtvupke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159086.9621158-1187-200090902863981/AnsiballZ_systemd.py'
Jan 23 09:04:47 compute-0 sudo[140658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:47 compute-0 python3.9[140660]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:47 compute-0 sudo[140658]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:47 compute-0 sudo[140813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqepbrkuadwvuatqdjwbbwgkltefbgyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159087.536607-1187-78451144062360/AnsiballZ_systemd.py'
Jan 23 09:04:47 compute-0 sudo[140813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:47 compute-0 python3.9[140815]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:48 compute-0 sudo[140813]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:48 compute-0 sudo[140968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgmazmgahlnenzgwgoefjvojufdtabrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159088.155624-1187-135394044206006/AnsiballZ_systemd.py'
Jan 23 09:04:48 compute-0 sudo[140968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:48 compute-0 python3.9[140970]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:48 compute-0 sudo[140968]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:48 compute-0 sudo[141123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygfsfhxltkwexgjsmfczxycdzcfiwndf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159088.7096384-1187-100321505630653/AnsiballZ_systemd.py'
Jan 23 09:04:48 compute-0 sudo[141123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:49 compute-0 python3.9[141125]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:49 compute-0 sudo[141123]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:49 compute-0 sudo[141278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyfebwbhqnpdoffvzagkzizzuhpfvzym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159089.2523117-1187-269704598105004/AnsiballZ_systemd.py'
Jan 23 09:04:49 compute-0 sudo[141278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:49 compute-0 python3.9[141280]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:49 compute-0 sudo[141278]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:49 compute-0 sudo[141433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrurkgrpqjwqwhbleqgnrlodrcfovbtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159089.8109844-1187-107648637743856/AnsiballZ_systemd.py'
Jan 23 09:04:49 compute-0 sudo[141433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:50 compute-0 python3.9[141435]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 09:04:50 compute-0 sudo[141433]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:50 compute-0 sudo[141588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naghssjmjzwgnsmugzpprtujndoxsffn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159090.6136074-1493-103287656716503/AnsiballZ_file.py'
Jan 23 09:04:50 compute-0 sudo[141588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:50 compute-0 python3.9[141590]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:50 compute-0 sudo[141588]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:51 compute-0 sudo[141740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvjhurmvocjhqrzkbnszdmvytirkhjlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159091.064187-1493-23669714209546/AnsiballZ_file.py'
Jan 23 09:04:51 compute-0 sudo[141740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:51 compute-0 podman[141742]: 2026-01-23 09:04:51.318891599 +0000 UTC m=+0.056895678 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true)
Jan 23 09:04:51 compute-0 python3.9[141743]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:51 compute-0 sudo[141740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:51 compute-0 sudo[141915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isouqstfovnfqjnktcdjtxntotugjtzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159091.5217185-1493-211702512186052/AnsiballZ_file.py'
Jan 23 09:04:51 compute-0 sudo[141915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:51 compute-0 python3.9[141917]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:51 compute-0 sudo[141915]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:52 compute-0 sudo[142067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdzqscxyfserksppdstgrlbjtziykpkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159091.956136-1493-19361725436468/AnsiballZ_file.py'
Jan 23 09:04:52 compute-0 sudo[142067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:52 compute-0 python3.9[142069]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:52 compute-0 sudo[142067]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:52 compute-0 sudo[142219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggpkvyieohnqtmdwjokwywcpnotgbbgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159092.3954394-1493-120690707553835/AnsiballZ_file.py'
Jan 23 09:04:52 compute-0 sudo[142219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:52 compute-0 python3.9[142221]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:52 compute-0 sudo[142219]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:52 compute-0 sudo[142371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqfkwqfsbevcszywtfbiswgnpqfvjnml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159092.811229-1493-233904578602577/AnsiballZ_file.py'
Jan 23 09:04:52 compute-0 sudo[142371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:53 compute-0 python3.9[142373]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:04:53 compute-0 sudo[142371]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:53 compute-0 python3.9[142523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:04:54 compute-0 sudo[142673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smehcixxocgztsedmscizjivqjjejusq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159094.0272233-1646-60345102092034/AnsiballZ_stat.py'
Jan 23 09:04:54 compute-0 sudo[142673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:54 compute-0 python3.9[142675]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:54 compute-0 sudo[142673]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:54 compute-0 sudo[142798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yukgppskztzwvrshhbxviinfjfpcxntf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159094.0272233-1646-60345102092034/AnsiballZ_copy.py'
Jan 23 09:04:54 compute-0 sudo[142798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:55 compute-0 python3.9[142800]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159094.0272233-1646-60345102092034/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:55 compute-0 sudo[142798]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:55 compute-0 sudo[142950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jijeedmlvekilojnxlkagfhdnacttvop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159095.2012193-1646-134517210133434/AnsiballZ_stat.py'
Jan 23 09:04:55 compute-0 sudo[142950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:55 compute-0 python3.9[142952]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:55 compute-0 sudo[142950]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:55 compute-0 sudo[143075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjjxjpuesqqcfercpempyrgyynceeqel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159095.2012193-1646-134517210133434/AnsiballZ_copy.py'
Jan 23 09:04:55 compute-0 sudo[143075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:55 compute-0 python3.9[143077]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159095.2012193-1646-134517210133434/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:55 compute-0 sudo[143075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:56 compute-0 sudo[143227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygthrhuhytqxgpgtlpzdfawyjgdbyfzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159096.050027-1646-95237690369590/AnsiballZ_stat.py'
Jan 23 09:04:56 compute-0 sudo[143227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:56 compute-0 python3.9[143229]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:56 compute-0 sudo[143227]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:56 compute-0 sudo[143352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aobqjgwaoszgeyhwkklrjdtoxprezabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159096.050027-1646-95237690369590/AnsiballZ_copy.py'
Jan 23 09:04:56 compute-0 sudo[143352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:56 compute-0 python3.9[143354]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159096.050027-1646-95237690369590/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:56 compute-0 sudo[143352]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:57 compute-0 sudo[143504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpsaplwygdahibpjrqlyljtnugkdsvpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159096.9213889-1646-118521511926514/AnsiballZ_stat.py'
Jan 23 09:04:57 compute-0 sudo[143504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:57 compute-0 python3.9[143506]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:57 compute-0 sudo[143504]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:57 compute-0 sudo[143629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyvdjgzgopbehshrijdcvynqoireqlzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159096.9213889-1646-118521511926514/AnsiballZ_copy.py'
Jan 23 09:04:57 compute-0 sudo[143629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:57 compute-0 python3.9[143631]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159096.9213889-1646-118521511926514/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:57 compute-0 sudo[143629]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:57 compute-0 sudo[143781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rchsivxxlwoknwhamahvptxphyljdctf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159097.7193725-1646-10638020889463/AnsiballZ_stat.py'
Jan 23 09:04:57 compute-0 sudo[143781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:58 compute-0 python3.9[143783]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:58 compute-0 sudo[143781]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:58 compute-0 sudo[143906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwsiqtavpqgmpqzzqzuititzkoxcgxma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159097.7193725-1646-10638020889463/AnsiballZ_copy.py'
Jan 23 09:04:58 compute-0 sudo[143906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:58 compute-0 python3.9[143908]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159097.7193725-1646-10638020889463/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:58 compute-0 sudo[143906]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:58 compute-0 sudo[144058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdplhyfxfsiryuivkvbeljcdueloyaln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159098.6889758-1646-31124706830073/AnsiballZ_stat.py'
Jan 23 09:04:58 compute-0 sudo[144058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:59 compute-0 python3.9[144060]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:59 compute-0 sudo[144058]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:59 compute-0 sudo[144183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nctjkorpjlukzltvelevuesuirkdbwkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159098.6889758-1646-31124706830073/AnsiballZ_copy.py'
Jan 23 09:04:59 compute-0 sudo[144183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:59 compute-0 python3.9[144185]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159098.6889758-1646-31124706830073/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:59 compute-0 sudo[144183]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:59 compute-0 sudo[144335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atqgxqflfdpyblbzrbkwmkjzggrfvevu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159099.5464451-1646-202768217963931/AnsiballZ_stat.py'
Jan 23 09:04:59 compute-0 sudo[144335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:59 compute-0 python3.9[144337]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:04:59 compute-0 sudo[144335]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:00 compute-0 sudo[144458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgiozzwwfpstknsvdxufvdsnxtsanhzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159099.5464451-1646-202768217963931/AnsiballZ_copy.py'
Jan 23 09:05:00 compute-0 sudo[144458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:00 compute-0 python3.9[144460]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159099.5464451-1646-202768217963931/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:00 compute-0 sudo[144458]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:00 compute-0 sudo[144610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulwuyvcfztayaywmwalqqlegfwtaofsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159100.3815904-1646-9587894562005/AnsiballZ_stat.py'
Jan 23 09:05:00 compute-0 sudo[144610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:00 compute-0 python3.9[144612]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:00 compute-0 sudo[144610]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:00 compute-0 sudo[144735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvsbdztqeglxwkmfdgqprfkifdokfnda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159100.3815904-1646-9587894562005/AnsiballZ_copy.py'
Jan 23 09:05:00 compute-0 sudo[144735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:01 compute-0 python3.9[144737]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159100.3815904-1646-9587894562005/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:01 compute-0 sudo[144735]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:01 compute-0 sudo[144887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxazunzdexhtgkbgobdcpzqsmfcjfhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159101.2865956-1985-275663858343157/AnsiballZ_command.py'
Jan 23 09:05:01 compute-0 sudo[144887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:01 compute-0 python3.9[144889]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 09:05:01 compute-0 sudo[144887]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:01 compute-0 sudo[145040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqlplihmkkqnjlxtkgbhyydnvldwsguc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159101.8198466-2012-111356445921983/AnsiballZ_file.py'
Jan 23 09:05:01 compute-0 sudo[145040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:02 compute-0 python3.9[145042]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:02 compute-0 sudo[145040]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:02 compute-0 sudo[145192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocvvejktowxzhqpfthkmnhluplxgtfpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159102.2577262-2012-70579080932778/AnsiballZ_file.py'
Jan 23 09:05:02 compute-0 sudo[145192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:02 compute-0 python3.9[145194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:02 compute-0 sudo[145192]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:02 compute-0 sudo[145344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eocjjolglpcjfyztilnbwddnccflospv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159102.6893728-2012-124515131791194/AnsiballZ_file.py'
Jan 23 09:05:02 compute-0 sudo[145344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:03 compute-0 python3.9[145346]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:03 compute-0 sudo[145344]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:03 compute-0 sudo[145496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsaphswxkobfnhtvdyatguwycolfbydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159103.1150956-2012-244935295608568/AnsiballZ_file.py'
Jan 23 09:05:03 compute-0 sudo[145496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:03 compute-0 python3.9[145498]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:03 compute-0 sudo[145496]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:03 compute-0 sudo[145648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-punixinuxjqspospwcymikhdcdhrwdck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159103.5418088-2012-168327232061273/AnsiballZ_file.py'
Jan 23 09:05:03 compute-0 sudo[145648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:03 compute-0 python3.9[145650]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:03 compute-0 sudo[145648]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:04 compute-0 sudo[145800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejmrhldeehkjgrxlqmmunxlyipqqdtqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159104.0231855-2012-77156720138956/AnsiballZ_file.py'
Jan 23 09:05:04 compute-0 sudo[145800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:04 compute-0 python3.9[145802]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:04 compute-0 sudo[145800]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:04 compute-0 sudo[145952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndvlnvkzfrdinsfinyziaovnuvtkvjia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159104.4933326-2012-22323043519954/AnsiballZ_file.py'
Jan 23 09:05:04 compute-0 sudo[145952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:04 compute-0 python3.9[145954]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:04 compute-0 sudo[145952]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:05 compute-0 sudo[146104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdvetnxsuoyenswxxmsgvgeezcqhoiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159104.9426956-2012-272843396572573/AnsiballZ_file.py'
Jan 23 09:05:05 compute-0 sudo[146104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:05 compute-0 python3.9[146106]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:05 compute-0 sudo[146104]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:05 compute-0 sudo[146256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swsayhrawyvvdssoglmvhysgdlzujwks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159105.3848033-2012-143888830911984/AnsiballZ_file.py'
Jan 23 09:05:05 compute-0 sudo[146256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:05 compute-0 python3.9[146258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:05 compute-0 sudo[146256]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:05 compute-0 sudo[146408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leyamjipiecitibxxetcrbubcmijigxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159105.8227937-2012-57323987055599/AnsiballZ_file.py'
Jan 23 09:05:05 compute-0 sudo[146408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:06 compute-0 python3.9[146410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:06 compute-0 sudo[146408]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:06 compute-0 sudo[146560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryditcmmegeutbipwgqdvmrknytyjfza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159106.2746875-2012-186898723987498/AnsiballZ_file.py'
Jan 23 09:05:06 compute-0 sudo[146560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:06 compute-0 python3.9[146562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:06 compute-0 sudo[146560]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:06 compute-0 sudo[146712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkngduxfutgojsapdvpzaocxtsojhhji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159106.7267213-2012-33268816117967/AnsiballZ_file.py'
Jan 23 09:05:06 compute-0 sudo[146712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:07 compute-0 python3.9[146714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:07 compute-0 sudo[146712]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:07 compute-0 sudo[146864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eunqzfskzctofpawowpvwrkzqeyiddej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159107.1814966-2012-10026778931088/AnsiballZ_file.py'
Jan 23 09:05:07 compute-0 sudo[146864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:07 compute-0 python3.9[146866]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:07 compute-0 sudo[146864]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:07 compute-0 sudo[147016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxadhzamtyghnbchhzelpsegomdqqzfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159107.623737-2012-191925597566600/AnsiballZ_file.py'
Jan 23 09:05:07 compute-0 sudo[147016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:07 compute-0 python3.9[147018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:07 compute-0 sudo[147016]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:08 compute-0 sudo[147168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfnoyhaasvrqfwtbzzltavlqhvgolees ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159108.1324255-2309-130306136334153/AnsiballZ_stat.py'
Jan 23 09:05:08 compute-0 sudo[147168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:08 compute-0 python3.9[147170]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:08 compute-0 sudo[147168]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:08 compute-0 sudo[147291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taqorgszvnxtjjuoodzrncfcuswffacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159108.1324255-2309-130306136334153/AnsiballZ_copy.py'
Jan 23 09:05:08 compute-0 sudo[147291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:09 compute-0 python3.9[147293]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159108.1324255-2309-130306136334153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:09 compute-0 sudo[147291]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:09 compute-0 sudo[147443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbxfulmgcvijxvnlbvnmvwkhuncweihb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159109.147317-2309-239425046988397/AnsiballZ_stat.py'
Jan 23 09:05:09 compute-0 sudo[147443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:09 compute-0 python3.9[147445]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:09 compute-0 sudo[147443]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:09 compute-0 sudo[147566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgiaejgepfssnlcwalopqczvypmtvcbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159109.147317-2309-239425046988397/AnsiballZ_copy.py'
Jan 23 09:05:09 compute-0 sudo[147566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:09 compute-0 python3.9[147568]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159109.147317-2309-239425046988397/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:09 compute-0 sudo[147566]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:10 compute-0 sudo[147718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaawoetymqtinokbdgfvnjctwutbmraw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159109.972028-2309-107050103437266/AnsiballZ_stat.py'
Jan 23 09:05:10 compute-0 sudo[147718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:10 compute-0 python3.9[147720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:10 compute-0 sudo[147718]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:10 compute-0 sudo[147841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlgmnlgwapshmkmswvelbyhgcxgcilek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159109.972028-2309-107050103437266/AnsiballZ_copy.py'
Jan 23 09:05:10 compute-0 sudo[147841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:10 compute-0 python3.9[147843]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159109.972028-2309-107050103437266/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:10 compute-0 sudo[147841]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:10 compute-0 sudo[148002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knonxhxryubimhurioclccnbzyagvjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159110.8224254-2309-127612854825919/AnsiballZ_stat.py'
Jan 23 09:05:11 compute-0 sudo[148002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:11 compute-0 podman[147967]: 2026-01-23 09:05:11.036139429 +0000 UTC m=+0.066907576 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 09:05:11 compute-0 python3.9[148011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:11 compute-0 sudo[148002]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:11 compute-0 sudo[148133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvtzxjycfuxcrucbpghmqifoatqxodny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159110.8224254-2309-127612854825919/AnsiballZ_copy.py'
Jan 23 09:05:11 compute-0 sudo[148133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:11 compute-0 python3.9[148135]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159110.8224254-2309-127612854825919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:11 compute-0 sudo[148133]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:11 compute-0 sudo[148285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-araaiugutxhjfvgiwpycfwyicruqcnky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159111.6418188-2309-240348453448741/AnsiballZ_stat.py'
Jan 23 09:05:11 compute-0 sudo[148285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:11 compute-0 python3.9[148287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:11 compute-0 sudo[148285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:12 compute-0 sudo[148408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjmozkrknukhtzfxxxjvktzopolgdxyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159111.6418188-2309-240348453448741/AnsiballZ_copy.py'
Jan 23 09:05:12 compute-0 sudo[148408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:12 compute-0 python3.9[148410]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159111.6418188-2309-240348453448741/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:12 compute-0 sudo[148408]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:12 compute-0 sudo[148560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsgovlrwawgmwhzwpeqtzrsmgemvwpwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159112.4689245-2309-80673079394838/AnsiballZ_stat.py'
Jan 23 09:05:12 compute-0 sudo[148560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:12 compute-0 python3.9[148562]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:12 compute-0 sudo[148560]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:13 compute-0 sudo[148683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atapwwaezglrbnfiwlgzqymsiezuchpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159112.4689245-2309-80673079394838/AnsiballZ_copy.py'
Jan 23 09:05:13 compute-0 sudo[148683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:13 compute-0 python3.9[148685]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159112.4689245-2309-80673079394838/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:13 compute-0 sudo[148683]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:13 compute-0 sudo[148835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxwmrvxxssmhhghihqufeuxfoihzdxah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159113.3237405-2309-137660595633188/AnsiballZ_stat.py'
Jan 23 09:05:13 compute-0 sudo[148835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:13 compute-0 python3.9[148837]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:13 compute-0 sudo[148835]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:14 compute-0 sudo[148958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guszfakisyqahvqafzsswldpwmkwresv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159113.3237405-2309-137660595633188/AnsiballZ_copy.py'
Jan 23 09:05:14 compute-0 sudo[148958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:14 compute-0 python3.9[148960]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159113.3237405-2309-137660595633188/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:14 compute-0 sudo[148958]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:14 compute-0 sudo[149110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-robrvmlikaduqemolixpuvjuagiucbmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159114.329451-2309-62793072965150/AnsiballZ_stat.py'
Jan 23 09:05:14 compute-0 sudo[149110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:14 compute-0 python3.9[149112]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:14 compute-0 sudo[149110]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:14 compute-0 sudo[149233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvknmnnnxdqcraqopjiavflzptjrkuco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159114.329451-2309-62793072965150/AnsiballZ_copy.py'
Jan 23 09:05:14 compute-0 sudo[149233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:15 compute-0 python3.9[149235]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159114.329451-2309-62793072965150/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:15 compute-0 sudo[149233]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:15 compute-0 sudo[149385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqbajjiuqtdpznckfpextvkyiztybon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159115.1590111-2309-203075667107803/AnsiballZ_stat.py'
Jan 23 09:05:15 compute-0 sudo[149385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:15 compute-0 python3.9[149387]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:15 compute-0 sudo[149385]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:15 compute-0 sudo[149508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvmuszeybpeaoshzndgizqoetgyqyksp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159115.1590111-2309-203075667107803/AnsiballZ_copy.py'
Jan 23 09:05:15 compute-0 sudo[149508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:15 compute-0 python3.9[149510]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159115.1590111-2309-203075667107803/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:15 compute-0 sudo[149508]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:16 compute-0 sudo[149660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yufbkttaoikhobwuhfexdoheliyksmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159116.0045757-2309-127613920661448/AnsiballZ_stat.py'
Jan 23 09:05:16 compute-0 sudo[149660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:16 compute-0 python3.9[149662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:16 compute-0 sudo[149660]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:16 compute-0 sudo[149783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghcsilhjlkzfyhatibaedewzeveopjvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159116.0045757-2309-127613920661448/AnsiballZ_copy.py'
Jan 23 09:05:16 compute-0 sudo[149783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:16 compute-0 python3.9[149785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159116.0045757-2309-127613920661448/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:16 compute-0 sudo[149783]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:17 compute-0 sudo[149935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdandwhbfieinpdiecxcagboxnxuwvod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159116.8288333-2309-159986623438124/AnsiballZ_stat.py'
Jan 23 09:05:17 compute-0 sudo[149935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:17 compute-0 python3.9[149937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:17 compute-0 sudo[149935]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:17 compute-0 sudo[150058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwefisfzztjazubkhkuiizfhmelmorzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159116.8288333-2309-159986623438124/AnsiballZ_copy.py'
Jan 23 09:05:17 compute-0 sudo[150058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:17 compute-0 python3.9[150060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159116.8288333-2309-159986623438124/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:17 compute-0 sudo[150058]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:17 compute-0 sudo[150210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynldutdqelxtlcxkbjpldqlsdunilzte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159117.681604-2309-64254624087210/AnsiballZ_stat.py'
Jan 23 09:05:17 compute-0 sudo[150210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:18 compute-0 python3.9[150212]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:18 compute-0 sudo[150210]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:18 compute-0 sudo[150333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhrtrpvwjjhnhxxsyyzovetqtrfbykqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159117.681604-2309-64254624087210/AnsiballZ_copy.py'
Jan 23 09:05:18 compute-0 sudo[150333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:18 compute-0 python3.9[150335]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159117.681604-2309-64254624087210/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:18 compute-0 sudo[150333]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:18 compute-0 sudo[150485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruiziepaxunuvkifjguyfbzogrqftdvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159118.5153687-2309-75143248722433/AnsiballZ_stat.py'
Jan 23 09:05:18 compute-0 sudo[150485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:18 compute-0 python3.9[150487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:18 compute-0 sudo[150485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:19 compute-0 sudo[150608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvcvtgplzjsbglaouuuhudbqtqaebybw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159118.5153687-2309-75143248722433/AnsiballZ_copy.py'
Jan 23 09:05:19 compute-0 sudo[150608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:19 compute-0 python3.9[150610]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159118.5153687-2309-75143248722433/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:19 compute-0 sudo[150608]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:19 compute-0 sudo[150760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwjvgkavrgonzhzhzyuwblmbulgjnyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159119.5315177-2309-262273776997880/AnsiballZ_stat.py'
Jan 23 09:05:19 compute-0 sudo[150760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:19 compute-0 python3.9[150762]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:19 compute-0 sudo[150760]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:20 compute-0 sudo[150883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqikyemrgcmjkjvjrfnuzqisevdqeeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159119.5315177-2309-262273776997880/AnsiballZ_copy.py'
Jan 23 09:05:20 compute-0 sudo[150883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:20 compute-0 python3.9[150885]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159119.5315177-2309-262273776997880/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:20 compute-0 sudo[150883]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:20 compute-0 python3.9[151035]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:05:21 compute-0 sudo[151188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshmwgnivnhpzmfstkeohhacjffcfags ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159120.9281242-2927-213206761911147/AnsiballZ_seboolean.py'
Jan 23 09:05:21 compute-0 sudo[151188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:21 compute-0 python3.9[151190]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 09:05:22 compute-0 dbus-broker-launch[733]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 09:05:22 compute-0 sudo[151188]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:22 compute-0 podman[151195]: 2026-01-23 09:05:22.228918766 +0000 UTC m=+0.063419159 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:05:22 compute-0 sudo[151368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfsrqnszvblawjgbtlybqkptxtehyjyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159122.3296306-2951-256531066973154/AnsiballZ_copy.py'
Jan 23 09:05:22 compute-0 sudo[151368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:22 compute-0 python3.9[151370]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:22 compute-0 sudo[151368]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:22 compute-0 sudo[151520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhesytbbfbadkchehezevddhsxuzyxbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159122.7616649-2951-143598755540746/AnsiballZ_copy.py'
Jan 23 09:05:22 compute-0 sudo[151520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:23 compute-0 python3.9[151522]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:23 compute-0 sudo[151520]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:23 compute-0 sudo[151672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqezaspkfrwsbqcgiaxtcnfemesiowvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159123.2139971-2951-2113967896831/AnsiballZ_copy.py'
Jan 23 09:05:23 compute-0 sudo[151672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:23 compute-0 python3.9[151674]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:23 compute-0 sudo[151672]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:23 compute-0 sudo[151824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztllllwfpbcjduwwmpimthuisqtijtsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159123.646053-2951-69114612334840/AnsiballZ_copy.py'
Jan 23 09:05:23 compute-0 sudo[151824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:23 compute-0 python3.9[151826]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:23 compute-0 sudo[151824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:24 compute-0 sudo[151976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwqhxhpxrknqqcnvaupqxnpifswtxgnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159124.0812318-2951-73853288670970/AnsiballZ_copy.py'
Jan 23 09:05:24 compute-0 sudo[151976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:24 compute-0 python3.9[151978]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:24 compute-0 sudo[151976]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:24 compute-0 sudo[152128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrxmfkexfsrwesswiamjgtkiakbddxbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159124.7867308-3059-39781208426652/AnsiballZ_copy.py'
Jan 23 09:05:24 compute-0 sudo[152128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:25 compute-0 python3.9[152130]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:25 compute-0 sudo[152128]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:25 compute-0 sudo[152280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvzkcjbjocokpgecvseilbvgeidxotre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159125.2440536-3059-29870029955664/AnsiballZ_copy.py'
Jan 23 09:05:25 compute-0 sudo[152280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:25 compute-0 python3.9[152282]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:25 compute-0 sudo[152280]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:25 compute-0 sudo[152432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbigpwhufwcywpiejycvdjdvlcekybwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159125.6988852-3059-264425797801418/AnsiballZ_copy.py'
Jan 23 09:05:25 compute-0 sudo[152432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:26 compute-0 python3.9[152434]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:26 compute-0 sudo[152432]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:26 compute-0 sudo[152584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzkmbqjaowqzuwiojdemzorbdkmigfsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159126.139642-3059-58797058380950/AnsiballZ_copy.py'
Jan 23 09:05:26 compute-0 sudo[152584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:26 compute-0 python3.9[152586]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:26 compute-0 sudo[152584]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:26 compute-0 sudo[152736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msoxtxsbjsnuvghwcsluifuqyktxfjzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159126.5717034-3059-246830535575182/AnsiballZ_copy.py'
Jan 23 09:05:26 compute-0 sudo[152736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:26 compute-0 python3.9[152738]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:26 compute-0 sudo[152736]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:27 compute-0 sudo[152888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvrgjjjdznpqtymynjljuleehhuznxwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159127.1278465-3167-126389265860755/AnsiballZ_systemd.py'
Jan 23 09:05:27 compute-0 sudo[152888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:27 compute-0 python3.9[152890]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:05:27 compute-0 systemd[1]: Reloading.
Jan 23 09:05:27 compute-0 systemd-sysv-generator[152916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:27 compute-0 systemd-rc-local-generator[152913]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:27 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 09:05:27 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 09:05:27 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 09:05:27 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 09:05:27 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 23 09:05:27 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 23 09:05:27 compute-0 sudo[152888]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:28 compute-0 sudo[153081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceeefnejocmxmubzjglbhurnggpduhfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159127.939703-3167-174751086196014/AnsiballZ_systemd.py'
Jan 23 09:05:28 compute-0 sudo[153081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:28 compute-0 python3.9[153083]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:05:28 compute-0 systemd[1]: Reloading.
Jan 23 09:05:28 compute-0 systemd-rc-local-generator[153105]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:28 compute-0 systemd-sysv-generator[153108]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:28 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 09:05:28 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 09:05:28 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 09:05:28 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 09:05:28 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 09:05:28 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 09:05:28 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 09:05:28 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 23 09:05:28 compute-0 sudo[153081]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:28 compute-0 sudo[153298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jskpiliqxahjbozkdmgujwyiroqmnxye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159128.773862-3167-15636907209313/AnsiballZ_systemd.py'
Jan 23 09:05:28 compute-0 sudo[153298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:29 compute-0 python3.9[153300]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:05:29 compute-0 systemd[1]: Reloading.
Jan 23 09:05:29 compute-0 systemd-rc-local-generator[153321]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:29 compute-0 systemd-sysv-generator[153326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:29 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 09:05:29 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 09:05:29 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 09:05:29 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 09:05:29 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 09:05:29 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 23 09:05:29 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 23 09:05:29 compute-0 sudo[153298]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:29 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 09:05:29 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 09:05:29 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 09:05:29 compute-0 sudo[153518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omekldcbwrvecvmgngqfcmysvqptkuof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159129.5766284-3167-51270781407773/AnsiballZ_systemd.py'
Jan 23 09:05:29 compute-0 sudo[153518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:30 compute-0 python3.9[153520]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:05:30 compute-0 systemd[1]: Reloading.
Jan 23 09:05:30 compute-0 systemd-sysv-generator[153545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:30 compute-0 systemd-rc-local-generator[153542]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:30 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 09:05:30 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 09:05:30 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 09:05:30 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 09:05:30 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 09:05:30 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 09:05:30 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 09:05:30 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 09:05:30 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 09:05:30 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 09:05:30 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 09:05:30 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 23 09:05:30 compute-0 sudo[153518]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:30 compute-0 setroubleshoot[153338]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 042d92b4-bc75-4ec8-99ee-dc3128053b60
Jan 23 09:05:30 compute-0 setroubleshoot[153338]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 23 09:05:30 compute-0 sudo[153736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcnxpgdxqgepesnqsqvywdqqihpalogz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159130.6126997-3167-194504259642379/AnsiballZ_systemd.py'
Jan 23 09:05:30 compute-0 sudo[153736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:31 compute-0 python3.9[153738]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:05:31 compute-0 systemd[1]: Reloading.
Jan 23 09:05:31 compute-0 systemd-rc-local-generator[153759]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:31 compute-0 systemd-sysv-generator[153762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:31 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 09:05:31 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 09:05:31 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 09:05:31 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 09:05:31 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 09:05:31 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 09:05:31 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 23 09:05:31 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 23 09:05:31 compute-0 sudo[153736]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:31 compute-0 sudo[153948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yakaezkyfnmmddzzmqtpjaapfqjtfeyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159131.5791237-3278-26354840050336/AnsiballZ_file.py'
Jan 23 09:05:31 compute-0 sudo[153948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:31 compute-0 python3.9[153950]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:31 compute-0 sudo[153948]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:32 compute-0 sudo[154100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slnwmcesqivaawydkmyzmcmqrqycgxbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159132.1637666-3302-43363605305695/AnsiballZ_find.py'
Jan 23 09:05:32 compute-0 sudo[154100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:32 compute-0 python3.9[154102]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:05:32 compute-0 sudo[154100]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:33 compute-0 sudo[154252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsssvkuafurzggmryruprppvyimplhwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159132.9984303-3344-138571228196845/AnsiballZ_stat.py'
Jan 23 09:05:33 compute-0 sudo[154252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:33 compute-0 python3.9[154254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:33 compute-0 sudo[154252]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:33 compute-0 sudo[154375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibvuqmfpqrvhvaddptaagcnixrqpdjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159132.9984303-3344-138571228196845/AnsiballZ_copy.py'
Jan 23 09:05:33 compute-0 sudo[154375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:33 compute-0 python3.9[154377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159132.9984303-3344-138571228196845/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:33 compute-0 sudo[154375]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:34 compute-0 sudo[154527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwiswweducptpljnvlbkqlvldfzahuqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159134.061529-3392-92274637796041/AnsiballZ_file.py'
Jan 23 09:05:34 compute-0 sudo[154527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:34 compute-0 python3.9[154529]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:34 compute-0 sudo[154527]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:34 compute-0 sudo[154679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokgdqtdifpfqfjtrrwwojbhlzjtrulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159134.599179-3416-221079835589957/AnsiballZ_stat.py'
Jan 23 09:05:34 compute-0 sudo[154679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:34 compute-0 python3.9[154681]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:34 compute-0 sudo[154679]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:35 compute-0 sudo[154757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ockkjlgvpabyljcjhtrsxusxpslipqhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159134.599179-3416-221079835589957/AnsiballZ_file.py'
Jan 23 09:05:35 compute-0 sudo[154757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:35 compute-0 python3.9[154759]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:35 compute-0 sudo[154757]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:35 compute-0 sudo[154909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoqgjtzpyfvlvhnqgfjfpebexurhwhto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159135.4802003-3452-49018851062683/AnsiballZ_stat.py'
Jan 23 09:05:35 compute-0 sudo[154909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:35 compute-0 python3.9[154911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:35 compute-0 sudo[154909]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:35 compute-0 sudo[154987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evifxeeqygxcxqxwtmmeukroqnorqqkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159135.4802003-3452-49018851062683/AnsiballZ_file.py'
Jan 23 09:05:35 compute-0 sudo[154987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:36 compute-0 python3.9[154989]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xh0nf3j9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:36 compute-0 sudo[154987]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:36 compute-0 sudo[155139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbzhspitqfpjskxqnxqckgwoiildxxzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159136.3303437-3488-80636388575122/AnsiballZ_stat.py'
Jan 23 09:05:36 compute-0 sudo[155139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:36 compute-0 python3.9[155141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:36 compute-0 sudo[155139]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:36 compute-0 sudo[155217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yidooagwfvdnryhbmnlxvkbwhxqonxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159136.3303437-3488-80636388575122/AnsiballZ_file.py'
Jan 23 09:05:36 compute-0 sudo[155217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:37 compute-0 python3.9[155219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:37 compute-0 sudo[155217]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:37 compute-0 sudo[155369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmhotyibmudyeyjrcjbbjqacragnlmnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159137.2640333-3527-159588097028864/AnsiballZ_command.py'
Jan 23 09:05:37 compute-0 sudo[155369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:37 compute-0 python3.9[155371]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:05:37 compute-0 sudo[155369]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:38 compute-0 sudo[155522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elyqfqowixrbxksqzyvhhicmksutdegi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159137.7816167-3551-47115695582289/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:05:38 compute-0 sudo[155522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:38 compute-0 python3[155524]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:05:38 compute-0 sudo[155522]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:38 compute-0 sudo[155674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvnedklqxwkohoxyujlvnllvorbtbwlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159138.3830311-3575-43549821514006/AnsiballZ_stat.py'
Jan 23 09:05:38 compute-0 sudo[155674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:38 compute-0 python3.9[155676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:38 compute-0 sudo[155674]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:38 compute-0 sudo[155752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyhbhpweiozgzorwvddosqzvvptmoycu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159138.3830311-3575-43549821514006/AnsiballZ_file.py'
Jan 23 09:05:38 compute-0 sudo[155752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:39 compute-0 python3.9[155754]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:39 compute-0 sudo[155752]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:39 compute-0 sudo[155904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sklxpxsdghdpmmjcxxgjshngxrlhwsfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159139.263732-3611-10995753698559/AnsiballZ_stat.py'
Jan 23 09:05:39 compute-0 sudo[155904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:39 compute-0 python3.9[155906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:39 compute-0 sudo[155904]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:05:39.843 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:05:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:05:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:05:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:05:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:05:39 compute-0 sudo[156029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phunfkmckkhzobjhpzfxubismtrtzkmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159139.263732-3611-10995753698559/AnsiballZ_copy.py'
Jan 23 09:05:39 compute-0 sudo[156029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:40 compute-0 python3.9[156031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159139.263732-3611-10995753698559/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:40 compute-0 sudo[156029]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:40 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 09:05:40 compute-0 sudo[156181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqrxidazguqthmhvpqftenoencylmhtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159140.2506447-3656-225231713720124/AnsiballZ_stat.py'
Jan 23 09:05:40 compute-0 sudo[156181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:40 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 09:05:40 compute-0 python3.9[156183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:40 compute-0 sudo[156181]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:40 compute-0 sudo[156259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbyhnhessmggqflbwyjbgnpfwbmhanix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159140.2506447-3656-225231713720124/AnsiballZ_file.py'
Jan 23 09:05:40 compute-0 sudo[156259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:41 compute-0 python3.9[156261]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:41 compute-0 sudo[156259]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:41 compute-0 podman[156286]: 2026-01-23 09:05:41.203141075 +0000 UTC m=+0.039711226 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 09:05:41 compute-0 sudo[156427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owyizxjqvxwvzbgasxxjwkmplqovuzxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159141.2979429-3692-89593467478978/AnsiballZ_stat.py'
Jan 23 09:05:41 compute-0 sudo[156427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:41 compute-0 python3.9[156429]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:41 compute-0 sudo[156427]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:41 compute-0 sudo[156505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gghmpytkbemewrtsjptkmxpaxsadztcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159141.2979429-3692-89593467478978/AnsiballZ_file.py'
Jan 23 09:05:41 compute-0 sudo[156505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:41 compute-0 python3.9[156507]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:41 compute-0 sudo[156505]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:42 compute-0 sudo[156657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhoiyrmzexwiunhjvosaocdbfcczgvux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159142.2410731-3728-101004966738940/AnsiballZ_stat.py'
Jan 23 09:05:42 compute-0 sudo[156657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:42 compute-0 python3.9[156659]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:42 compute-0 sudo[156657]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:42 compute-0 sudo[156782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytbptkgkpflmtkgyzsjarfdopvlonvud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159142.2410731-3728-101004966738940/AnsiballZ_copy.py'
Jan 23 09:05:42 compute-0 sudo[156782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:43 compute-0 python3.9[156784]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159142.2410731-3728-101004966738940/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:43 compute-0 sudo[156782]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:43 compute-0 sudo[156934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ataktatfkwbdjritldmdqytgxjkobbst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159143.33994-3773-263764755903595/AnsiballZ_file.py'
Jan 23 09:05:43 compute-0 sudo[156934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:43 compute-0 python3.9[156936]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:43 compute-0 sudo[156934]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:44 compute-0 sudo[157086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvvvhzmfkzxlvzdwhebvtrmhembmcmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159143.8911376-3797-136267486499475/AnsiballZ_command.py'
Jan 23 09:05:44 compute-0 sudo[157086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:44 compute-0 python3.9[157088]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:05:44 compute-0 sudo[157086]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:44 compute-0 sudo[157241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qngpikdmkjbtmmhiznbdksvetbndeosk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159144.4721823-3821-40466833625126/AnsiballZ_blockinfile.py'
Jan 23 09:05:44 compute-0 sudo[157241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:44 compute-0 python3.9[157243]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:44 compute-0 sudo[157241]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:45 compute-0 sudo[157393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxoywvhizdxlfgvejwqwcophbompgutr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159145.2560804-3848-124544679554359/AnsiballZ_command.py'
Jan 23 09:05:45 compute-0 sudo[157393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:45 compute-0 python3.9[157395]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:05:45 compute-0 sudo[157393]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:46 compute-0 sudo[157546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xivztdqluwewzfgklibqpvluuywmcxko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159145.973766-3872-112667599615629/AnsiballZ_stat.py'
Jan 23 09:05:46 compute-0 sudo[157546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:46 compute-0 python3.9[157548]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:05:46 compute-0 sudo[157546]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:46 compute-0 sudo[157700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idjpahzupsubwqyphyhmblwfvmeaktik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159146.5010555-3896-255514177492064/AnsiballZ_command.py'
Jan 23 09:05:46 compute-0 sudo[157700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:46 compute-0 python3.9[157702]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:05:46 compute-0 sudo[157700]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:47 compute-0 sudo[157855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkectawjsjimmiwgrsukbagbqbqqflqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159147.1029348-3920-192715187232469/AnsiballZ_file.py'
Jan 23 09:05:47 compute-0 sudo[157855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:47 compute-0 python3.9[157857]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:47 compute-0 sudo[157855]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:47 compute-0 sudo[158007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbiquotkamsoryeihmwuaofandlrjnxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159147.6831782-3944-152363569706410/AnsiballZ_stat.py'
Jan 23 09:05:47 compute-0 sudo[158007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:47 compute-0 python3.9[158009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:48 compute-0 sudo[158007]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:48 compute-0 sudo[158130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otcyxfcvedbskualiiirurubygtlotcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159147.6831782-3944-152363569706410/AnsiballZ_copy.py'
Jan 23 09:05:48 compute-0 sudo[158130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:48 compute-0 python3.9[158132]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159147.6831782-3944-152363569706410/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:48 compute-0 sudo[158130]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:48 compute-0 sudo[158282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnsdwmjuudgaictarxrsjgbtrmvnadwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159148.669025-3989-101622008610160/AnsiballZ_stat.py'
Jan 23 09:05:48 compute-0 sudo[158282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:48 compute-0 python3.9[158284]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:49 compute-0 sudo[158282]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:49 compute-0 sudo[158405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xokbbdpuqbvlwixzzdzndakvtozelfct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159148.669025-3989-101622008610160/AnsiballZ_copy.py'
Jan 23 09:05:49 compute-0 sudo[158405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:49 compute-0 python3.9[158407]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159148.669025-3989-101622008610160/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:49 compute-0 sudo[158405]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:49 compute-0 sudo[158557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwialubytlqxvybwbjjwcnqqxfgifoab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159149.6423798-4034-187957488497829/AnsiballZ_stat.py'
Jan 23 09:05:49 compute-0 sudo[158557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:49 compute-0 python3.9[158559]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:05:49 compute-0 sudo[158557]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:50 compute-0 sudo[158680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhpmxlptyffenkkxaplbskiufzlyhyyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159149.6423798-4034-187957488497829/AnsiballZ_copy.py'
Jan 23 09:05:50 compute-0 sudo[158680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:50 compute-0 python3.9[158682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159149.6423798-4034-187957488497829/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:50 compute-0 sudo[158680]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:50 compute-0 sudo[158832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdgkxughscfriuprlbirfhdeqdejruxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159150.5911758-4079-11213059640145/AnsiballZ_systemd.py'
Jan 23 09:05:50 compute-0 sudo[158832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:50 compute-0 python3.9[158834]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:05:51 compute-0 systemd[1]: Reloading.
Jan 23 09:05:51 compute-0 systemd-sysv-generator[158860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:51 compute-0 systemd-rc-local-generator[158857]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:51 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 09:05:51 compute-0 sudo[158832]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:51 compute-0 sudo[159023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzlnngbdvkcwbmujxdazsmjpjhmpigul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159151.469516-4103-155963208452568/AnsiballZ_systemd.py'
Jan 23 09:05:51 compute-0 sudo[159023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:51 compute-0 python3.9[159025]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:05:51 compute-0 systemd[1]: Reloading.
Jan 23 09:05:51 compute-0 systemd-rc-local-generator[159046]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:51 compute-0 systemd-sysv-generator[159049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:52 compute-0 systemd[1]: Reloading.
Jan 23 09:05:52 compute-0 systemd-rc-local-generator[159083]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:05:52 compute-0 systemd-sysv-generator[159086]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:05:52 compute-0 sudo[159023]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:52 compute-0 podman[159098]: 2026-01-23 09:05:52.396122308 +0000 UTC m=+0.057693366 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 09:05:52 compute-0 sshd-session[104520]: Connection closed by 192.168.122.30 port 47766
Jan 23 09:05:52 compute-0 sshd-session[104517]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:05:52 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 09:05:52 compute-0 systemd[1]: session-22.scope: Consumed 2min 20.782s CPU time.
Jan 23 09:05:52 compute-0 systemd-logind[746]: Session 22 logged out. Waiting for processes to exit.
Jan 23 09:05:52 compute-0 systemd-logind[746]: Removed session 22.
Jan 23 09:05:58 compute-0 sshd-session[159145]: Accepted publickey for zuul from 192.168.122.30 port 49558 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:05:58 compute-0 systemd-logind[746]: New session 23 of user zuul.
Jan 23 09:05:58 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 23 09:05:58 compute-0 sshd-session[159145]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:05:58 compute-0 python3.9[159298]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:05:59 compute-0 python3.9[159452]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:05:59 compute-0 network[159469]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:05:59 compute-0 network[159470]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:05:59 compute-0 network[159471]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:06:03 compute-0 sudo[159740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bupgiomrznlzyhuaaqaknnhxynwkqmgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159163.4641895-101-16181247641781/AnsiballZ_setup.py'
Jan 23 09:06:03 compute-0 sudo[159740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:03 compute-0 python3.9[159742]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:06:04 compute-0 sudo[159740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:04 compute-0 sudo[159824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onsgdizmflgptmchlqdjcglapwrhyysa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159163.4641895-101-16181247641781/AnsiballZ_dnf.py'
Jan 23 09:06:04 compute-0 sudo[159824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:04 compute-0 python3.9[159826]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:06:08 compute-0 sudo[159824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:09 compute-0 sudo[159977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqrzlegzlqlhvuccyqlhbnughjwdfcxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159168.9604678-137-214168540652884/AnsiballZ_stat.py'
Jan 23 09:06:09 compute-0 sudo[159977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:09 compute-0 python3.9[159979]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:06:09 compute-0 sudo[159977]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:09 compute-0 sudo[160129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjturphpwihlrmayakgkgjcsaifgfrqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159169.6334634-167-51341697167063/AnsiballZ_command.py'
Jan 23 09:06:09 compute-0 sudo[160129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:10 compute-0 python3.9[160131]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:06:10 compute-0 sudo[160129]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:10 compute-0 sudo[160282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzcgtlrknrrfquchouizjkcckfschrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159170.419602-197-67646077087810/AnsiballZ_stat.py'
Jan 23 09:06:10 compute-0 sudo[160282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:10 compute-0 python3.9[160284]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:06:10 compute-0 sudo[160282]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:11 compute-0 sudo[160434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwcplpqroqvjvbctslrllzexdwruudkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159170.9936843-221-185019169287181/AnsiballZ_command.py'
Jan 23 09:06:11 compute-0 sudo[160434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:11 compute-0 python3.9[160436]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:06:11 compute-0 sudo[160434]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:11 compute-0 sudo[160596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usucvocterzmruzclkorfmobyzkvflwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159171.529865-245-3547419754093/AnsiballZ_stat.py'
Jan 23 09:06:11 compute-0 sudo[160596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:11 compute-0 podman[160561]: 2026-01-23 09:06:11.755375043 +0000 UTC m=+0.066605392 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:06:11 compute-0 python3.9[160603]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:06:11 compute-0 sudo[160596]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:12 compute-0 sudo[160726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnkmelpqhufbofahohawbzdjhnpetjpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159171.529865-245-3547419754093/AnsiballZ_copy.py'
Jan 23 09:06:12 compute-0 sudo[160726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:12 compute-0 python3.9[160728]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159171.529865-245-3547419754093/.source.iscsi _original_basename=.vdlncxi8 follow=False checksum=72f2fa1861fdae2ae81f8fb3e50bc15b78c42d93 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:12 compute-0 sudo[160726]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:12 compute-0 sudo[160878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixksnhoicgcoyezjygvqkzqwnttqpfek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159172.5188973-290-126027410832671/AnsiballZ_file.py'
Jan 23 09:06:12 compute-0 sudo[160878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:12 compute-0 python3.9[160880]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:12 compute-0 sudo[160878]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:13 compute-0 sudo[161030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafymrgcgtvnmeijsnievjlkechmwqvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159173.1562161-314-130381360194435/AnsiballZ_lineinfile.py'
Jan 23 09:06:13 compute-0 sudo[161030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:13 compute-0 python3.9[161032]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:13 compute-0 sudo[161030]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:14 compute-0 sudo[161182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkwwidjfpcdgpsnjhbyvnigmvmuhrrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159173.8328438-341-199352491869507/AnsiballZ_systemd_service.py'
Jan 23 09:06:14 compute-0 sudo[161182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:14 compute-0 python3.9[161184]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:14 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 09:06:14 compute-0 sudo[161182]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:14 compute-0 sudo[161338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmsebxglskliafedbikindiudgthstxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159174.7466087-365-156339917628876/AnsiballZ_systemd_service.py'
Jan 23 09:06:14 compute-0 sudo[161338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:15 compute-0 python3.9[161340]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:15 compute-0 systemd[1]: Reloading.
Jan 23 09:06:15 compute-0 systemd-rc-local-generator[161363]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:15 compute-0 systemd-sysv-generator[161370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:15 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 09:06:15 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 23 09:06:15 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 09:06:15 compute-0 systemd[1]: Started Open-iSCSI.
Jan 23 09:06:15 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 09:06:15 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 09:06:15 compute-0 sudo[161338]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:16 compute-0 python3.9[161539]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:06:16 compute-0 network[161556]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:06:16 compute-0 network[161557]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:06:16 compute-0 network[161558]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:06:18 compute-0 sudo[161827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtffdlelqqabzipnnhdgmbtqaydofbrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159178.7300503-434-153811040097215/AnsiballZ_dnf.py'
Jan 23 09:06:18 compute-0 sudo[161827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:19 compute-0 python3.9[161829]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:06:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:06:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:06:21 compute-0 systemd[1]: Reloading.
Jan 23 09:06:21 compute-0 systemd-sysv-generator[161877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:21 compute-0 systemd-rc-local-generator[161873]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:06:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:06:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:06:21 compute-0 systemd[1]: run-rf0429e2cb9004fdf92e0f043b26c5d5a.service: Deactivated successfully.
Jan 23 09:06:21 compute-0 sudo[161827]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:22 compute-0 sudo[162151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swmbjexliqyfwfymztnhtdlpulnlkeur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159182.5311003-461-26920133081696/AnsiballZ_file.py'
Jan 23 09:06:22 compute-0 sudo[162151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:22 compute-0 podman[162117]: 2026-01-23 09:06:22.741687957 +0000 UTC m=+0.061721131 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 09:06:22 compute-0 python3.9[162162]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 09:06:22 compute-0 sudo[162151]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:23 compute-0 sudo[162319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtjuhjwdymgcatlnhwsqqpsovcgktscx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159183.0596957-485-26129116316948/AnsiballZ_modprobe.py'
Jan 23 09:06:23 compute-0 sudo[162319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:23 compute-0 python3.9[162321]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 09:06:23 compute-0 sudo[162319]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:23 compute-0 sudo[162475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkcddnyepjnekebqzrwjbklbuggfezhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159183.7961168-509-86142636956792/AnsiballZ_stat.py'
Jan 23 09:06:23 compute-0 sudo[162475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:24 compute-0 python3.9[162477]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:06:24 compute-0 sudo[162475]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:24 compute-0 sudo[162598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klittfjyvutuwftdvgysmknvsjvclhxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159183.7961168-509-86142636956792/AnsiballZ_copy.py'
Jan 23 09:06:24 compute-0 sudo[162598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:24 compute-0 python3.9[162600]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159183.7961168-509-86142636956792/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:24 compute-0 sudo[162598]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:24 compute-0 sudo[162750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhfyndgndyzukselaqfhyjhvztoommda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159184.7630656-557-36962939045713/AnsiballZ_lineinfile.py'
Jan 23 09:06:24 compute-0 sudo[162750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:25 compute-0 python3.9[162752]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:25 compute-0 sudo[162750]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:25 compute-0 sudo[162902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aybuavmilenyjkjleonwuwydpogoioyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159185.2468905-581-118783125246390/AnsiballZ_systemd.py'
Jan 23 09:06:25 compute-0 sudo[162902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:25 compute-0 python3.9[162904]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:06:25 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 09:06:25 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 23 09:06:25 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 23 09:06:25 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 23 09:06:25 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 23 09:06:25 compute-0 sudo[162902]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:26 compute-0 sudo[163058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xewswulewyemvzpjtziitpubcykjlpnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159186.0858958-605-533586070252/AnsiballZ_command.py'
Jan 23 09:06:26 compute-0 sudo[163058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:26 compute-0 python3.9[163060]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:06:26 compute-0 sudo[163058]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:26 compute-0 sudo[163211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkkpkehhuyqknbgaaselmogudohkogfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159186.7142303-635-162906217295308/AnsiballZ_stat.py'
Jan 23 09:06:26 compute-0 sudo[163211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:27 compute-0 python3.9[163213]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:06:27 compute-0 sudo[163211]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:27 compute-0 sudo[163363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zumtxtowmtsbwqxxafcuurlzdnhbwtvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159187.2438984-662-258106605882079/AnsiballZ_stat.py'
Jan 23 09:06:27 compute-0 sudo[163363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:27 compute-0 python3.9[163365]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:06:27 compute-0 sudo[163363]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:27 compute-0 sudo[163486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amsbcgkyewpmqfyembiqrpircispdvpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159187.2438984-662-258106605882079/AnsiballZ_copy.py'
Jan 23 09:06:27 compute-0 sudo[163486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:27 compute-0 python3.9[163488]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159187.2438984-662-258106605882079/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:27 compute-0 sudo[163486]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:28 compute-0 sudo[163638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwqhoioqaxpjidvqztjdoyumpsdliclf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159188.1224363-707-77075485092380/AnsiballZ_command.py'
Jan 23 09:06:28 compute-0 sudo[163638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:28 compute-0 python3.9[163640]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:06:28 compute-0 sudo[163638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:28 compute-0 sudo[163791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kalvlzgcdgptnzbppeyvuevzdtmhbczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159188.6096718-731-247197680923585/AnsiballZ_lineinfile.py'
Jan 23 09:06:28 compute-0 sudo[163791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:28 compute-0 python3.9[163793]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:28 compute-0 sudo[163791]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:29 compute-0 sudo[163943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvnkxvfburxtobsotenjkzpsheaaadif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159189.0971158-755-56368170667758/AnsiballZ_replace.py'
Jan 23 09:06:29 compute-0 sudo[163943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:29 compute-0 python3.9[163945]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:29 compute-0 sudo[163943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:29 compute-0 sudo[164095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoiosrprnpdfytcblmqsoilbxdylppim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159189.6707442-779-152418602316528/AnsiballZ_replace.py'
Jan 23 09:06:29 compute-0 sudo[164095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:29 compute-0 python3.9[164097]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:29 compute-0 sudo[164095]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:30 compute-0 sudo[164247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glievoxparrkpggptfjgryyenhgxvahp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159190.2012055-806-179390034280412/AnsiballZ_lineinfile.py'
Jan 23 09:06:30 compute-0 sudo[164247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:30 compute-0 python3.9[164249]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:30 compute-0 sudo[164247]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:30 compute-0 sudo[164399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgiwgtoirrxmaeadkoekarrbpkgwawfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159190.6050563-806-166038161150585/AnsiballZ_lineinfile.py'
Jan 23 09:06:30 compute-0 sudo[164399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:30 compute-0 python3.9[164401]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:30 compute-0 sudo[164399]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:31 compute-0 sudo[164551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozkrfksxtfoznqmbnozsjvfqobmxydyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159190.9988577-806-74008135980611/AnsiballZ_lineinfile.py'
Jan 23 09:06:31 compute-0 sudo[164551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:31 compute-0 python3.9[164553]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:31 compute-0 sudo[164551]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:31 compute-0 sudo[164703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uslxskhwehtxcfcrweajfnbdmspjtcre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159191.3945966-806-32508598532407/AnsiballZ_lineinfile.py'
Jan 23 09:06:31 compute-0 sudo[164703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:31 compute-0 python3.9[164705]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:31 compute-0 sudo[164703]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:32 compute-0 sudo[164855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfyzyzecqdtvxhwimjsuerrctugeyubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159192.0131056-893-208389376702347/AnsiballZ_stat.py'
Jan 23 09:06:32 compute-0 sudo[164855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:32 compute-0 python3.9[164857]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:06:32 compute-0 sudo[164855]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:32 compute-0 sudo[165009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kanohopkzaoqcvgvplxiwnlemhwospys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159192.5172014-917-215121437325901/AnsiballZ_command.py'
Jan 23 09:06:32 compute-0 sudo[165009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:32 compute-0 python3.9[165011]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:06:32 compute-0 sudo[165009]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:33 compute-0 sudo[165162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrpzsfdzajdpvluywqozbhzzegrxxtzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159193.1081247-944-247121364431368/AnsiballZ_systemd_service.py'
Jan 23 09:06:33 compute-0 sudo[165162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:33 compute-0 python3.9[165164]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:33 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 23 09:06:33 compute-0 sudo[165162]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:34 compute-0 sudo[165318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmbxyqrukbrnnlcolqmunmdmokfuksqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159193.7854257-968-215327892579643/AnsiballZ_systemd_service.py'
Jan 23 09:06:34 compute-0 sudo[165318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:34 compute-0 python3.9[165320]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:34 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 09:06:34 compute-0 udevadm[165325]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 09:06:34 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 09:06:34 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 09:06:34 compute-0 multipathd[165329]: --------start up--------
Jan 23 09:06:34 compute-0 multipathd[165329]: read /etc/multipath.conf
Jan 23 09:06:34 compute-0 multipathd[165329]: path checkers start up
Jan 23 09:06:34 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 09:06:34 compute-0 sudo[165318]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:34 compute-0 sudo[165486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-casgwlouptnqufjjzqxyhuhfldwuurbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159194.8084192-1004-178213012742078/AnsiballZ_file.py'
Jan 23 09:06:34 compute-0 sudo[165486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:35 compute-0 python3.9[165488]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 09:06:35 compute-0 sudo[165486]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:35 compute-0 sudo[165638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngroioedgdzqbrrnyeqlfrmsfojvtqro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159195.314038-1028-22784739583082/AnsiballZ_modprobe.py'
Jan 23 09:06:35 compute-0 sudo[165638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:35 compute-0 python3.9[165640]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 09:06:35 compute-0 kernel: Key type psk registered
Jan 23 09:06:35 compute-0 sudo[165638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:36 compute-0 sudo[165799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdyymyjynxppoamcifonnqvqtafexpxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159195.8637557-1052-15107246917565/AnsiballZ_stat.py'
Jan 23 09:06:36 compute-0 sudo[165799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:36 compute-0 python3.9[165801]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:06:36 compute-0 sudo[165799]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:36 compute-0 sudo[165922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zewbemqfvwudsbhjgeaenmvjwkrxktfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159195.8637557-1052-15107246917565/AnsiballZ_copy.py'
Jan 23 09:06:36 compute-0 sudo[165922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:36 compute-0 python3.9[165924]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159195.8637557-1052-15107246917565/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:36 compute-0 sudo[165922]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:36 compute-0 sudo[166074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faxqzpzlkpvgejbahcjjdpjwlzojonrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159196.8085756-1100-238759437659128/AnsiballZ_lineinfile.py'
Jan 23 09:06:36 compute-0 sudo[166074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:37 compute-0 python3.9[166076]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:37 compute-0 sudo[166074]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:37 compute-0 sudo[166226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyxdrjsilipibyvkfpwnaprbsqafwvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159197.2792249-1124-180398047450361/AnsiballZ_systemd.py'
Jan 23 09:06:37 compute-0 sudo[166226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:37 compute-0 python3.9[166228]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:06:37 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 09:06:37 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 23 09:06:37 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 23 09:06:37 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 23 09:06:37 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 23 09:06:37 compute-0 sudo[166226]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:38 compute-0 sudo[166382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eedtxamwrioosqcoukuthpcxnvbbkcup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159197.9499743-1148-122734641659362/AnsiballZ_dnf.py'
Jan 23 09:06:38 compute-0 sudo[166382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:38 compute-0 python3.9[166384]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:06:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:06:39.844 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:06:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:06:39.845 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:06:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:06:39.845 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:06:41 compute-0 systemd[1]: Reloading.
Jan 23 09:06:41 compute-0 systemd-rc-local-generator[166411]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:41 compute-0 systemd-sysv-generator[166419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:41 compute-0 systemd[1]: Reloading.
Jan 23 09:06:42 compute-0 podman[166428]: 2026-01-23 09:06:42.00824059 +0000 UTC m=+0.047879460 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 09:06:42 compute-0 systemd-rc-local-generator[166463]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:42 compute-0 systemd-sysv-generator[166466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:42 compute-0 systemd-logind[746]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 09:06:42 compute-0 systemd-logind[746]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 09:06:42 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:06:42 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:06:42 compute-0 systemd[1]: Reloading.
Jan 23 09:06:42 compute-0 systemd-sysv-generator[166559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:42 compute-0 systemd-rc-local-generator[166555]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:42 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:06:42 compute-0 sudo[166382]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:43 compute-0 sudo[167832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgorbvjadtejvbpzttvfnfkxrsywwqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159203.1320367-1172-198417155536517/AnsiballZ_systemd_service.py'
Jan 23 09:06:43 compute-0 sudo[167832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:06:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:06:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.040s CPU time.
Jan 23 09:06:43 compute-0 systemd[1]: run-r3986cd5d56b74bebad6bd95dfe243b2f.service: Deactivated successfully.
Jan 23 09:06:43 compute-0 python3.9[167852]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:06:43 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 23 09:06:43 compute-0 iscsid[161380]: iscsid shutting down.
Jan 23 09:06:43 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 09:06:43 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 23 09:06:43 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 09:06:43 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 23 09:06:43 compute-0 systemd[1]: Started Open-iSCSI.
Jan 23 09:06:43 compute-0 sudo[167832]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:43 compute-0 sudo[168014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsjffmjovgsltyopxuasyjptqufgowbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159203.7735672-1196-275127313006055/AnsiballZ_systemd_service.py'
Jan 23 09:06:43 compute-0 sudo[168014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:44 compute-0 python3.9[168016]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:06:44 compute-0 multipathd[165329]: exit (signal)
Jan 23 09:06:44 compute-0 multipathd[165329]: --------shut down-------
Jan 23 09:06:44 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 09:06:44 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 09:06:44 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 09:06:44 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 09:06:44 compute-0 multipathd[168022]: --------start up--------
Jan 23 09:06:44 compute-0 multipathd[168022]: read /etc/multipath.conf
Jan 23 09:06:44 compute-0 multipathd[168022]: path checkers start up
Jan 23 09:06:44 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 09:06:44 compute-0 sudo[168014]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:44 compute-0 python3.9[168179]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:06:45 compute-0 sudo[168333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsyxsjndccmdhcgomgkqxqmnqizbyivs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159205.2950747-1248-206508336593332/AnsiballZ_file.py'
Jan 23 09:06:45 compute-0 sudo[168333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:45 compute-0 python3.9[168335]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:45 compute-0 sudo[168333]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:46 compute-0 sudo[168485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syeyewcbelttfmxbqkaaarvsgjzqdztn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159206.021612-1281-196641692063604/AnsiballZ_systemd_service.py'
Jan 23 09:06:46 compute-0 sudo[168485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:46 compute-0 python3.9[168487]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:06:46 compute-0 systemd[1]: Reloading.
Jan 23 09:06:46 compute-0 systemd-rc-local-generator[168508]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:06:46 compute-0 systemd-sysv-generator[168516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:06:46 compute-0 sudo[168485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:47 compute-0 python3.9[168671]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:06:47 compute-0 network[168688]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:06:47 compute-0 network[168689]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:06:47 compute-0 network[168690]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:06:49 compute-0 sudo[168960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyemliizchbzqwhjgshxdtpvsftltndf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159209.4646664-1338-141120895850891/AnsiballZ_systemd_service.py'
Jan 23 09:06:49 compute-0 sudo[168960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:49 compute-0 python3.9[168962]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:49 compute-0 sudo[168960]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:50 compute-0 sudo[169113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhjemsfuqqaxfocbpykhnjcqwypjjtiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159210.0391452-1338-159731476479148/AnsiballZ_systemd_service.py'
Jan 23 09:06:50 compute-0 sudo[169113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:50 compute-0 python3.9[169115]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:51 compute-0 sudo[169113]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:51 compute-0 sudo[169266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytjdihusdtcnxpfuynewnxenojjnhxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159211.5978742-1338-130006164275486/AnsiballZ_systemd_service.py'
Jan 23 09:06:51 compute-0 sudo[169266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:52 compute-0 python3.9[169268]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:52 compute-0 sudo[169266]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:52 compute-0 sudo[169419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cztpriebonpxawhdvqjhaecrygaendhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159212.12059-1338-65397261329418/AnsiballZ_systemd_service.py'
Jan 23 09:06:52 compute-0 sudo[169419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:52 compute-0 python3.9[169421]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:52 compute-0 sudo[169419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:52 compute-0 sudo[169580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vojxcnlrslbhumdpgzycnymnmdxzgcly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159212.6502562-1338-32937856324690/AnsiballZ_systemd_service.py'
Jan 23 09:06:52 compute-0 sudo[169580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:52 compute-0 podman[169546]: 2026-01-23 09:06:52.861172633 +0000 UTC m=+0.061189125 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 09:06:53 compute-0 python3.9[169589]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:53 compute-0 sudo[169580]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:53 compute-0 sudo[169748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhthwnwzitdsoymgcqyafsgdcquyacy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159213.1964524-1338-212239923886555/AnsiballZ_systemd_service.py'
Jan 23 09:06:53 compute-0 sudo[169748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:53 compute-0 python3.9[169750]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:53 compute-0 sudo[169748]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:53 compute-0 sudo[169901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzvakjzmfnrlpqqlojtruqfiahjrcirz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159213.7367754-1338-94651947225244/AnsiballZ_systemd_service.py'
Jan 23 09:06:53 compute-0 sudo[169901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:54 compute-0 python3.9[169903]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:54 compute-0 sudo[169901]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:54 compute-0 sudo[170054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzpenejsqmhxmhkjpclkgwdiksswakvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159214.2505774-1338-43231462406945/AnsiballZ_systemd_service.py'
Jan 23 09:06:54 compute-0 sudo[170054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:54 compute-0 python3.9[170056]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:06:54 compute-0 sudo[170054]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:55 compute-0 sudo[170207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gajbxhkowlsfsacxiefeemilvhzwewuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159215.023167-1515-213859447598542/AnsiballZ_file.py'
Jan 23 09:06:55 compute-0 sudo[170207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:55 compute-0 python3.9[170209]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:55 compute-0 sudo[170207]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:55 compute-0 sudo[170359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcaoatszwvimnbwnbveabnfswyesrmxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159215.4467387-1515-95363207853291/AnsiballZ_file.py'
Jan 23 09:06:55 compute-0 sudo[170359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:55 compute-0 python3.9[170361]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:55 compute-0 sudo[170359]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:56 compute-0 sudo[170511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqicvyzvulykmaoljsbqvsqggxrrjsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159215.8709247-1515-137816134179266/AnsiballZ_file.py'
Jan 23 09:06:56 compute-0 sudo[170511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:56 compute-0 python3.9[170513]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:56 compute-0 sudo[170511]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:56 compute-0 sudo[170663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdaqrzquutlkepfvnduspyygovsmazlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159216.4223611-1515-91493121090805/AnsiballZ_file.py'
Jan 23 09:06:56 compute-0 sudo[170663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:56 compute-0 python3.9[170665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:56 compute-0 sudo[170663]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:57 compute-0 sudo[170815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhfetzwebffihrmmyrjhqpijyapkshzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159216.851564-1515-28796109337813/AnsiballZ_file.py'
Jan 23 09:06:57 compute-0 sudo[170815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:57 compute-0 python3.9[170817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:57 compute-0 sudo[170815]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:57 compute-0 sudo[170967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtaedvgicdkjrtbupsrhwltipniasvdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159217.2826157-1515-191636492558959/AnsiballZ_file.py'
Jan 23 09:06:57 compute-0 sudo[170967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:57 compute-0 python3.9[170969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:57 compute-0 sudo[170967]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:57 compute-0 sudo[171119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwrubaymzjzpahdvctqfgunefghnooxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159217.7194698-1515-102097063166496/AnsiballZ_file.py'
Jan 23 09:06:57 compute-0 sudo[171119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:58 compute-0 python3.9[171121]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:58 compute-0 sudo[171119]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:58 compute-0 sudo[171271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsvkunrdooywzelhnhwgqgoxvwvlojf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159218.1504188-1515-109095098900463/AnsiballZ_file.py'
Jan 23 09:06:58 compute-0 sudo[171271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:58 compute-0 python3.9[171273]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:58 compute-0 sudo[171271]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:58 compute-0 sudo[171423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtdejecyopfjnlgjsppvlwjwdfafngei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159218.7972205-1686-233618058785395/AnsiballZ_file.py'
Jan 23 09:06:58 compute-0 sudo[171423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:59 compute-0 python3.9[171425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:59 compute-0 sudo[171423]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:59 compute-0 sudo[171575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bujwewojdpiyrvrbivcukzweysbdcjhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159219.2271526-1686-269641429510979/AnsiballZ_file.py'
Jan 23 09:06:59 compute-0 sudo[171575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:59 compute-0 python3.9[171577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:59 compute-0 sudo[171575]: pam_unix(sudo:session): session closed for user root
Jan 23 09:06:59 compute-0 sudo[171727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcbvelfrogvpgnzlbppwhzarfbdcbxqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159219.6625698-1686-44344229300608/AnsiballZ_file.py'
Jan 23 09:06:59 compute-0 sudo[171727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:06:59 compute-0 python3.9[171729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:06:59 compute-0 sudo[171727]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:00 compute-0 sudo[171879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrvvpzzzbtfgswmszsgdnprgemcvjmno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159220.0765007-1686-196198611968670/AnsiballZ_file.py'
Jan 23 09:07:00 compute-0 sudo[171879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:00 compute-0 python3.9[171881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:00 compute-0 sudo[171879]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:00 compute-0 sudo[172031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muadeblmjiqremhmvmivngwjuuufqwmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159220.4826345-1686-183443497773323/AnsiballZ_file.py'
Jan 23 09:07:00 compute-0 sudo[172031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:00 compute-0 python3.9[172033]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:00 compute-0 sudo[172031]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:01 compute-0 sudo[172183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vondbdnyzhpvlhwksbzssmpazomcwrpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159220.8818438-1686-211010757074372/AnsiballZ_file.py'
Jan 23 09:07:01 compute-0 sudo[172183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:01 compute-0 python3.9[172185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:01 compute-0 sudo[172183]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:01 compute-0 sudo[172335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzxeehphvnfoshmuofsqqufwececsku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159221.2978752-1686-277050435553001/AnsiballZ_file.py'
Jan 23 09:07:01 compute-0 sudo[172335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:01 compute-0 python3.9[172337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:01 compute-0 sudo[172335]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:01 compute-0 sudo[172487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoibvajpyvlvrdmlefkbbjgkstxmvbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159221.7067044-1686-232519625110607/AnsiballZ_file.py'
Jan 23 09:07:01 compute-0 sudo[172487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:02 compute-0 python3.9[172489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:02 compute-0 sudo[172487]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:02 compute-0 sudo[172639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjmdysmloslfdifmmpvtmzgbdcpplgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159222.4053986-1860-196773777132628/AnsiballZ_command.py'
Jan 23 09:07:02 compute-0 sudo[172639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:02 compute-0 python3.9[172641]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:02 compute-0 sudo[172639]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:03 compute-0 python3.9[172793]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:07:03 compute-0 sudo[172943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uypasxfpjlbddeyhdrlcdguypalxfuby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159223.5861092-1914-234234511394455/AnsiballZ_systemd_service.py'
Jan 23 09:07:03 compute-0 sudo[172943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:03 compute-0 python3.9[172945]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:07:04 compute-0 systemd[1]: Reloading.
Jan 23 09:07:04 compute-0 systemd-sysv-generator[172971]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:07:04 compute-0 systemd-rc-local-generator[172967]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:07:04 compute-0 sudo[172943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:04 compute-0 sudo[173132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvhehormgrudntyxjlkxarclrklddxmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159224.4326725-1938-68494596430386/AnsiballZ_command.py'
Jan 23 09:07:04 compute-0 sudo[173132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:04 compute-0 python3.9[173134]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:04 compute-0 sudo[173132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:05 compute-0 sudo[173285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxqstrmnqnkbpldukjwyvwxgrsjatgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159224.8784487-1938-236517422597339/AnsiballZ_command.py'
Jan 23 09:07:05 compute-0 sudo[173285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:05 compute-0 python3.9[173287]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:05 compute-0 sudo[173285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:05 compute-0 sudo[173438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbrgmbdmwhvmuaqttjkonbkkmnomvpor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159225.3375132-1938-63856867586916/AnsiballZ_command.py'
Jan 23 09:07:05 compute-0 sudo[173438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:05 compute-0 python3.9[173440]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:05 compute-0 sudo[173438]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:05 compute-0 sudo[173591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoljpdtojyavjfozewsvucywtyzrgrci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159225.815712-1938-20546006839512/AnsiballZ_command.py'
Jan 23 09:07:05 compute-0 sudo[173591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:06 compute-0 python3.9[173593]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:06 compute-0 sudo[173591]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:06 compute-0 sudo[173744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktnypmoxytygjkdewmdyfmwgtgqzuxxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159226.2779799-1938-67577391476870/AnsiballZ_command.py'
Jan 23 09:07:06 compute-0 sudo[173744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:06 compute-0 python3.9[173746]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:06 compute-0 sudo[173744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:06 compute-0 sudo[173897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aefjubvkhdkdnenopqesgsjccjcujjkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159226.7283072-1938-54404419621993/AnsiballZ_command.py'
Jan 23 09:07:06 compute-0 sudo[173897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:07 compute-0 python3.9[173899]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:07 compute-0 sudo[173897]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:07 compute-0 sudo[174050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzccceaupxptdkfwbsiylyiabusosfzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159227.162198-1938-58119489433778/AnsiballZ_command.py'
Jan 23 09:07:07 compute-0 sudo[174050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:07 compute-0 python3.9[174052]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:07 compute-0 sudo[174050]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:07 compute-0 sudo[174203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nncqosrepysgpzweaoxhdoptcergxcuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159227.589082-1938-7443111600463/AnsiballZ_command.py'
Jan 23 09:07:07 compute-0 sudo[174203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:07 compute-0 python3.9[174205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:07 compute-0 sudo[174203]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:09 compute-0 sudo[174356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoqmwtlispzhpmwttmxdywfrnlcbjvki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159229.0033753-2145-87636051611025/AnsiballZ_file.py'
Jan 23 09:07:09 compute-0 sudo[174356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:09 compute-0 python3.9[174358]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:09 compute-0 sudo[174356]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:09 compute-0 sudo[174508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmibvxbvruzcednaqcscgfxqrqctooxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159229.461673-2145-29680196721383/AnsiballZ_file.py'
Jan 23 09:07:09 compute-0 sudo[174508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:09 compute-0 python3.9[174510]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:09 compute-0 sudo[174508]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:10 compute-0 sudo[174660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtofmzjeoyabruutxrwxuurpqjhtqfzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159229.9116082-2145-243251527429970/AnsiballZ_file.py'
Jan 23 09:07:10 compute-0 sudo[174660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:10 compute-0 python3.9[174662]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:10 compute-0 sudo[174660]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:10 compute-0 sudo[174812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etmlrmytflzdvscfrteczauxybxvvevw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159230.4145138-2211-270405700232832/AnsiballZ_file.py'
Jan 23 09:07:10 compute-0 sudo[174812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:10 compute-0 python3.9[174814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:10 compute-0 sudo[174812]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:11 compute-0 sudo[174964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cehfaixoyhbttoozyujxlcbvjaormqsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159230.9293573-2211-197281606560880/AnsiballZ_file.py'
Jan 23 09:07:11 compute-0 sudo[174964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:11 compute-0 python3.9[174966]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:11 compute-0 sudo[174964]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:11 compute-0 sudo[175116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihfcuqxfkvzakpehmuxcyjcwbrnllckf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159231.3810258-2211-111930877811028/AnsiballZ_file.py'
Jan 23 09:07:11 compute-0 sudo[175116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:11 compute-0 python3.9[175118]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:11 compute-0 sudo[175116]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:12 compute-0 sudo[175268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejjwzgvckixxsbmmmvdadvxlsnybwegv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159231.8434029-2211-204970779763351/AnsiballZ_file.py'
Jan 23 09:07:12 compute-0 sudo[175268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:12 compute-0 python3.9[175270]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:12 compute-0 sudo[175268]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:12 compute-0 sudo[175429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzsivyaqlhavszhhyiawkwmuqvojjpth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159232.3089755-2211-180962409714352/AnsiballZ_file.py'
Jan 23 09:07:12 compute-0 sudo[175429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:12 compute-0 podman[175394]: 2026-01-23 09:07:12.539511963 +0000 UTC m=+0.059894655 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:07:12 compute-0 python3.9[175434]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:12 compute-0 sudo[175429]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:13 compute-0 sudo[175588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bebnwtftsofzpnnrejygmnxopbyepnpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159232.8026168-2211-146000971182320/AnsiballZ_file.py'
Jan 23 09:07:13 compute-0 sudo[175588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:13 compute-0 python3.9[175590]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:13 compute-0 sudo[175588]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:13 compute-0 sudo[175740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouphulwbbjotntxxtthzvlgoujmwrwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159233.325906-2211-43125036624357/AnsiballZ_file.py'
Jan 23 09:07:13 compute-0 sudo[175740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:13 compute-0 python3.9[175742]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:13 compute-0 sudo[175740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:17 compute-0 sudo[175892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchnpdkdnxtrgjgsdbvhreprtoggeoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159237.118793-2516-73593489870434/AnsiballZ_getent.py'
Jan 23 09:07:17 compute-0 sudo[175892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:17 compute-0 python3.9[175894]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 09:07:17 compute-0 sudo[175892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:18 compute-0 sudo[176045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wumrflxcsfswmgwdbujkxwchochbhhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159237.8008635-2540-226723257001371/AnsiballZ_group.py'
Jan 23 09:07:18 compute-0 sudo[176045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:18 compute-0 python3.9[176047]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:07:18 compute-0 groupadd[176048]: group added to /etc/group: name=nova, GID=42436
Jan 23 09:07:18 compute-0 groupadd[176048]: group added to /etc/gshadow: name=nova
Jan 23 09:07:18 compute-0 groupadd[176048]: new group: name=nova, GID=42436
Jan 23 09:07:18 compute-0 sudo[176045]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:18 compute-0 sudo[176203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzfmzbahkmvobfkepdugyjqrmuzprpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159238.5251553-2564-129658222256012/AnsiballZ_user.py'
Jan 23 09:07:18 compute-0 sudo[176203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:19 compute-0 python3.9[176205]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:07:19 compute-0 useradd[176207]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 23 09:07:19 compute-0 useradd[176207]: add 'nova' to group 'libvirt'
Jan 23 09:07:19 compute-0 useradd[176207]: add 'nova' to shadow group 'libvirt'
Jan 23 09:07:19 compute-0 sudo[176203]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:20 compute-0 sshd-session[176238]: Accepted publickey for zuul from 192.168.122.30 port 50340 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:07:20 compute-0 systemd-logind[746]: New session 24 of user zuul.
Jan 23 09:07:20 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 23 09:07:20 compute-0 sshd-session[176238]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:07:20 compute-0 sshd-session[176241]: Received disconnect from 192.168.122.30 port 50340:11: disconnected by user
Jan 23 09:07:20 compute-0 sshd-session[176241]: Disconnected from user zuul 192.168.122.30 port 50340
Jan 23 09:07:20 compute-0 sshd-session[176238]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:07:20 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 09:07:20 compute-0 systemd-logind[746]: Session 24 logged out. Waiting for processes to exit.
Jan 23 09:07:20 compute-0 systemd-logind[746]: Removed session 24.
Jan 23 09:07:20 compute-0 python3.9[176391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:21 compute-0 python3.9[176512]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159240.387902-2639-172256897514139/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:21 compute-0 python3.9[176662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:21 compute-0 python3.9[176738]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:22 compute-0 python3.9[176888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:22 compute-0 python3.9[177009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159242.0994432-2639-62627790010413/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:23 compute-0 podman[177104]: 2026-01-23 09:07:23.227878413 +0000 UTC m=+0.066187854 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:07:23 compute-0 python3.9[177182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:23 compute-0 python3.9[177303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159243.0505095-2639-106613716336356/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:24 compute-0 python3.9[177453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:24 compute-0 python3.9[177574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159243.9612122-2639-139527564743307/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:25 compute-0 python3.9[177724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:25 compute-0 python3.9[177845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159244.8299139-2639-256157093610397/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:25 compute-0 sudo[177995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahwytaxcglyiffpyncrtnlhijvoywafh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159245.7340796-2888-256604698816346/AnsiballZ_file.py'
Jan 23 09:07:25 compute-0 sudo[177995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:26 compute-0 python3.9[177997]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:26 compute-0 sudo[177995]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:26 compute-0 sudo[178147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqyycbjrqhwignjveixmuzovhqocosmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159246.2164583-2912-40383242625243/AnsiballZ_copy.py'
Jan 23 09:07:26 compute-0 sudo[178147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:26 compute-0 python3.9[178149]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:26 compute-0 sudo[178147]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:26 compute-0 sudo[178299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcivphivorqhufuaetdmbmzbmfbzgyee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159246.7201912-2936-83728784063497/AnsiballZ_stat.py'
Jan 23 09:07:26 compute-0 sudo[178299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:27 compute-0 python3.9[178301]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:27 compute-0 sudo[178299]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:27 compute-0 sudo[178451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncpmiqomkhvdmgoqsrqnebkthdzlhugk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159247.223656-2960-122707194295747/AnsiballZ_stat.py'
Jan 23 09:07:27 compute-0 sudo[178451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:27 compute-0 python3.9[178453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:27 compute-0 sudo[178451]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:27 compute-0 sudo[178575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhznevlzldizilidhnhpcvnylhtjkqrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159247.223656-2960-122707194295747/AnsiballZ_copy.py'
Jan 23 09:07:27 compute-0 sudo[178575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:27 compute-0 python3.9[178577]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769159247.223656-2960-122707194295747/.source _original_basename=.g9g5p84b follow=False checksum=86fe1896b0f7e87872cd1a0d72acf64cc719fb0a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 09:07:27 compute-0 sudo[178575]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:28 compute-0 python3.9[178729]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:28 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 09:07:29 compute-0 python3.9[178882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:29 compute-0 python3.9[179003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159248.7439158-3038-178299555418701/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:29 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 09:07:29 compute-0 python3.9[179154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:07:30 compute-0 python3.9[179275]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159249.625667-3083-15740696195919/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:07:30 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 09:07:30 compute-0 sudo[179426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vexmdffuxgpgdkudeaxgqeqazcckiznz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159250.6639712-3134-95409239516521/AnsiballZ_container_config_data.py'
Jan 23 09:07:30 compute-0 sudo[179426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:31 compute-0 python3.9[179428]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 09:07:31 compute-0 sudo[179426]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:31 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 09:07:31 compute-0 sudo[179579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upfzhetskhprjnnoqeczohostwpoajvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159251.4960837-3167-261503944529676/AnsiballZ_container_config_hash.py'
Jan 23 09:07:31 compute-0 sudo[179579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:31 compute-0 python3.9[179581]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:07:31 compute-0 sudo[179579]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:32 compute-0 sudo[179731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fobdvxktdnkmymzmjxcvzbbbpomzgnzm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159252.271722-3197-214468570129238/AnsiballZ_edpm_container_manage.py'
Jan 23 09:07:32 compute-0 sudo[179731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:32 compute-0 python3[179733]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:07:32 compute-0 podman[179761]: 2026-01-23 09:07:32.965189881 +0000 UTC m=+0.031238380 container create 4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=edpm)
Jan 23 09:07:32 compute-0 podman[179761]: 2026-01-23 09:07:32.950630275 +0000 UTC m=+0.016678784 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 09:07:32 compute-0 python3[179733]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 09:07:33 compute-0 sudo[179731]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:33 compute-0 sudo[179938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydtoudkhiquzvjigtomixvjiwpmpnten ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159253.1886241-3221-223493574012019/AnsiballZ_stat.py'
Jan 23 09:07:33 compute-0 sudo[179938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:33 compute-0 python3.9[179940]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:33 compute-0 sudo[179938]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:34 compute-0 sudo[180092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjmjenhssebgaosdqzzotdtjilgqfvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159254.0921736-3257-248510993186545/AnsiballZ_container_config_data.py'
Jan 23 09:07:34 compute-0 sudo[180092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:34 compute-0 python3.9[180094]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 09:07:34 compute-0 sudo[180092]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:35 compute-0 sudo[180244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytlddppacpfzplzabtabzvaexusqmuxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159254.8202147-3290-129617730716617/AnsiballZ_container_config_hash.py'
Jan 23 09:07:35 compute-0 sudo[180244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:35 compute-0 python3.9[180246]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:07:35 compute-0 sudo[180244]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:35 compute-0 sudo[180396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjthipptdnisvakpihyrgwzpgwardmie ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159255.576202-3320-158539297693246/AnsiballZ_edpm_container_manage.py'
Jan 23 09:07:35 compute-0 sudo[180396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:36 compute-0 python3[180398]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:07:36 compute-0 podman[180426]: 2026-01-23 09:07:36.155958409 +0000 UTC m=+0.035757713 container create 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 09:07:36 compute-0 podman[180426]: 2026-01-23 09:07:36.141879551 +0000 UTC m=+0.021678876 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 09:07:36 compute-0 python3[180398]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 23 09:07:36 compute-0 sudo[180396]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:36 compute-0 sudo[180603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypqrdfbsfdrzlgrejguzkhditiruoubu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159256.4336665-3344-177851456091899/AnsiballZ_stat.py'
Jan 23 09:07:36 compute-0 sudo[180603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:36 compute-0 python3.9[180605]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:36 compute-0 sudo[180603]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:37 compute-0 sudo[180757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncaasulhytzdchueslpsyvvxkfeeyiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159257.07334-3371-176643570920152/AnsiballZ_file.py'
Jan 23 09:07:37 compute-0 sudo[180757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:37 compute-0 python3.9[180759]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:37 compute-0 sudo[180757]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:37 compute-0 sudo[180908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlaakhuugldlndjstgllewajdcgskuvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159257.5129247-3371-176910270852753/AnsiballZ_copy.py'
Jan 23 09:07:37 compute-0 sudo[180908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:37 compute-0 python3.9[180910]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159257.5129247-3371-176910270852753/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:37 compute-0 sudo[180908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:38 compute-0 sudo[180984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdqnagzxrlnrlscwawsgnrdwzbgxqnfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159257.5129247-3371-176910270852753/AnsiballZ_systemd.py'
Jan 23 09:07:38 compute-0 sudo[180984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:38 compute-0 python3.9[180986]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:07:38 compute-0 systemd[1]: Reloading.
Jan 23 09:07:38 compute-0 systemd-rc-local-generator[181007]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:07:38 compute-0 systemd-sysv-generator[181011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:07:38 compute-0 sudo[180984]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:38 compute-0 sudo[181095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhlbcrjpavwwmafejjyheqkxxdaivbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159257.5129247-3371-176910270852753/AnsiballZ_systemd.py'
Jan 23 09:07:38 compute-0 sudo[181095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:39 compute-0 python3.9[181097]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:07:39 compute-0 systemd[1]: Reloading.
Jan 23 09:07:39 compute-0 systemd-rc-local-generator[181120]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:07:39 compute-0 systemd-sysv-generator[181123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:07:39 compute-0 systemd[1]: Starting nova_compute container...
Jan 23 09:07:39 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:39 compute-0 podman[181137]: 2026-01-23 09:07:39.469774955 +0000 UTC m=+0.081974438 container init 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:07:39 compute-0 podman[181137]: 2026-01-23 09:07:39.479243626 +0000 UTC m=+0.091443099 container start 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_managed=true)
Jan 23 09:07:39 compute-0 podman[181137]: nova_compute
Jan 23 09:07:39 compute-0 nova_compute[181149]: + sudo -E kolla_set_configs
Jan 23 09:07:39 compute-0 systemd[1]: Started nova_compute container.
Jan 23 09:07:39 compute-0 sudo[181095]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Validating config file
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying service configuration files
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Deleting /etc/ceph
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Creating directory /etc/ceph
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Writing out command to execute
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:39 compute-0 nova_compute[181149]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 09:07:39 compute-0 nova_compute[181149]: ++ cat /run_command
Jan 23 09:07:39 compute-0 nova_compute[181149]: + CMD=nova-compute
Jan 23 09:07:39 compute-0 nova_compute[181149]: + ARGS=
Jan 23 09:07:39 compute-0 nova_compute[181149]: + sudo kolla_copy_cacerts
Jan 23 09:07:39 compute-0 nova_compute[181149]: + [[ ! -n '' ]]
Jan 23 09:07:39 compute-0 nova_compute[181149]: + . kolla_extend_start
Jan 23 09:07:39 compute-0 nova_compute[181149]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 09:07:39 compute-0 nova_compute[181149]: Running command: 'nova-compute'
Jan 23 09:07:39 compute-0 nova_compute[181149]: + umask 0022
Jan 23 09:07:39 compute-0 nova_compute[181149]: + exec nova-compute
Jan 23 09:07:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:07:39.845 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:07:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:07:39.845 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:07:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:07:39.845 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:07:40 compute-0 python3.9[181311]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:41 compute-0 python3.9[181461]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.212 181153 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.213 181153 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.213 181153 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.213 181153 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.325 181153 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.335 181153 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.335 181153 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 09:07:41 compute-0 python3.9[181615]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.759 181153 INFO nova.virt.driver [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.844 181153 INFO nova.compute.provider_config [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.864 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.864 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.864 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.865 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.866 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.867 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.868 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.869 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.870 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.870 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.870 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.870 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.870 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.871 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.872 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.873 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.874 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.875 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.876 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.877 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.878 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.879 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.880 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.881 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.882 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.883 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.884 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.885 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.886 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.887 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.888 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.889 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.890 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.891 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.892 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.893 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.894 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.895 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.896 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.897 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.898 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.899 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.900 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.901 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.902 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.903 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.904 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.905 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.906 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.907 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.908 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.909 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.910 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.911 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.912 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.913 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.914 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.915 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.915 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.915 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.915 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.915 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.916 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.917 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.918 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.919 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.920 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.921 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.922 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.923 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.924 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.925 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.926 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.927 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.928 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.929 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.930 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.931 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.932 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 WARNING oslo_config.cfg [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 09:07:41 compute-0 nova_compute[181149]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 09:07:41 compute-0 nova_compute[181149]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 09:07:41 compute-0 nova_compute[181149]: and ``live_migration_inbound_addr`` respectively.
Jan 23 09:07:41 compute-0 nova_compute[181149]: ).  Its value may be silently ignored in the future.
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.933 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.934 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.935 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.936 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.937 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.938 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.939 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.940 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.941 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.942 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.943 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.944 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.945 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.946 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.947 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.948 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.949 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.950 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.951 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.952 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.953 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.954 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.955 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.956 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.957 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.958 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.959 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.960 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.961 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.962 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.963 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.964 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.965 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.966 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.967 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.968 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.969 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.970 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.971 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.972 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.973 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.974 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.975 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.976 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.977 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.978 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.979 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.980 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.981 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.982 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.983 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.984 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.985 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.986 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.987 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.988 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.989 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.990 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.991 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.992 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.993 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.993 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.993 181153 DEBUG oslo_service.service [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:07:41 compute-0 nova_compute[181149]: 2026-01-23 09:07:41.993 181153 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.004 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.005 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.005 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.005 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 09:07:42 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 09:07:42 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.057 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fca51066640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.059 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fca51066640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.060 181153 INFO nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Connection event '1' reason 'None'
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.071 181153 WARNING nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.071 181153 DEBUG nova.virt.libvirt.volume.mount [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 09:07:42 compute-0 sudo[181817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnsnpezhycceqjsrhachfzdhdcxfaijj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159261.9503758-3551-7069979897358/AnsiballZ_podman_container.py'
Jan 23 09:07:42 compute-0 sudo[181817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:42 compute-0 python3.9[181819]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 09:07:42 compute-0 sudo[181817]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:42 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:07:42 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.769 181153 INFO nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]: 
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <host>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <uuid>5a008b0a-4e78-4797-b262-5f7749cb75af</uuid>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <arch>x86_64</arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model>EPYC-Milan-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <vendor>AMD</vendor>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <microcode version='167776725'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <signature family='25' model='1' stepping='1'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <maxphysaddr mode='emulate' bits='48'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='x2apic'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='tsc-deadline'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='osxsave'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='hypervisor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='tsc_adjust'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='ospke'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='vaes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='vpclmulqdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='spec-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='stibp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='arch-capabilities'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='cmp_legacy'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='virt-ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='lbrv'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='tsc-scale'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='vmcb-clean'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='pause-filter'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='pfthreshold'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='v-vmsave-vmload'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='vgif'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='rdctl-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='mds-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature name='pschange-mc-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <pages unit='KiB' size='4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <pages unit='KiB' size='2048'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <pages unit='KiB' size='1048576'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <power_management>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <suspend_mem/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <suspend_disk/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <suspend_hybrid/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </power_management>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <iommu support='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <migration_features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <live/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <uri_transports>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <uri_transport>tcp</uri_transport>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <uri_transport>rdma</uri_transport>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </uri_transports>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </migration_features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <topology>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <cells num='1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <cell id='0'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <memory unit='KiB'>7865364</memory>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <pages unit='KiB' size='4'>1966341</pages>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <distances>
Jan 23 09:07:42 compute-0 nova_compute[181149]:             <sibling id='0' value='10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           </distances>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           <cpus num='4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:           </cpus>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         </cell>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </cells>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </topology>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <cache>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </cache>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <secmodel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model>selinux</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <doi>0</doi>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </secmodel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <secmodel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model>dac</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <doi>0</doi>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </secmodel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </host>
Jan 23 09:07:42 compute-0 nova_compute[181149]: 
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <guest>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <os_type>hvm</os_type>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <arch name='i686'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <wordsize>32</wordsize>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <domain type='qemu'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <domain type='kvm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <pae/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <nonpae/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <acpi default='on' toggle='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <apic default='on' toggle='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <cpuselection/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <deviceboot/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <disksnapshot default='on' toggle='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <externalSnapshot/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </guest>
Jan 23 09:07:42 compute-0 nova_compute[181149]: 
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <guest>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <os_type>hvm</os_type>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <arch name='x86_64'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <wordsize>64</wordsize>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <domain type='qemu'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <domain type='kvm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <acpi default='on' toggle='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <apic default='on' toggle='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <cpuselection/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <deviceboot/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <disksnapshot default='on' toggle='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <externalSnapshot/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </guest>
Jan 23 09:07:42 compute-0 nova_compute[181149]: 
Jan 23 09:07:42 compute-0 nova_compute[181149]: </capabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]: 
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.775 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.795 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 09:07:42 compute-0 nova_compute[181149]: <domainCapabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <domain>kvm</domain>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <arch>i686</arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <vcpu max='4096'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <iothreads supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <os supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <enum name='firmware'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <loader supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>rom</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pflash</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='readonly'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>yes</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='secure'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </loader>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </os>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='maximumMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <vendor>AMD</vendor>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='succor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='custom' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='athlon'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='athlon-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='core2duo'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='core2duo-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='coreduo'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='coreduo-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='n270'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='n270-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='phenom'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='phenom-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <memoryBacking supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <enum name='sourceType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>file</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>anonymous</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>memfd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </memoryBacking>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <devices>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <disk supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='diskDevice'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>disk</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>cdrom</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>floppy</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>lun</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>fdc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>sata</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </disk>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <graphics supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vnc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>egl-headless</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </graphics>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <video supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='modelType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vga</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>cirrus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>none</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>bochs</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>ramfb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </video>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <hostdev supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='mode'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>subsystem</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='startupPolicy'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>mandatory</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>requisite</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>optional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='subsysType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pci</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='capsType'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='pciBackend'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </hostdev>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <rng supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>random</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>egd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </rng>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <filesystem supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='driverType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>path</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>handle</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtiofs</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </filesystem>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <tpm supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tpm-tis</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tpm-crb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>emulator</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>external</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendVersion'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>2.0</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </tpm>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <redirdev supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </redirdev>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <channel supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </channel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <crypto supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>qemu</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </crypto>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <interface supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>passt</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </interface>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <panic supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>isa</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>hyperv</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </panic>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <console supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>null</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dev</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>file</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pipe</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>stdio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>udp</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tcp</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>qemu-vdagent</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </console>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </devices>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <gic supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <genid supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <backup supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <async-teardown supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <s390-pv supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <ps2 supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <tdx supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <sev supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <sgx supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <hyperv supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='features'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>relaxed</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vapic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>spinlocks</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vpindex</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>runtime</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>synic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>stimer</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>reset</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vendor_id</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>frequencies</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>reenlightenment</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tlbflush</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>ipi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>avic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>emsr_bitmap</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>xmm_input</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <defaults>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </defaults>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </hyperv>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <launchSecurity supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </features>
Jan 23 09:07:42 compute-0 nova_compute[181149]: </domainCapabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.801 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 09:07:42 compute-0 nova_compute[181149]: <domainCapabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <domain>kvm</domain>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <arch>i686</arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <vcpu max='240'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <iothreads supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <os supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <enum name='firmware'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <loader supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>rom</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pflash</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='readonly'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>yes</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='secure'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </loader>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </os>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='maximumMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <vendor>AMD</vendor>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='succor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='custom' supported='yes'>
Jan 23 09:07:42 compute-0 podman[181885]: 2026-01-23 09:07:42.835242164 +0000 UTC m=+0.061160353 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='athlon'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='athlon-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='core2duo'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='core2duo-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='coreduo'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='coreduo-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='n270'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='n270-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='phenom'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='phenom-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <memoryBacking supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <enum name='sourceType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>file</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>anonymous</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>memfd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </memoryBacking>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <devices>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <disk supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='diskDevice'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>disk</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>cdrom</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>floppy</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>lun</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>ide</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>fdc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>sata</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </disk>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <graphics supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vnc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>egl-headless</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </graphics>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <video supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='modelType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vga</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>cirrus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>none</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>bochs</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>ramfb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </video>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <hostdev supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='mode'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>subsystem</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='startupPolicy'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>mandatory</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>requisite</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>optional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='subsysType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pci</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='capsType'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='pciBackend'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </hostdev>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <rng supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>random</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>egd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </rng>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <filesystem supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='driverType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>path</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>handle</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>virtiofs</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </filesystem>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <tpm supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tpm-tis</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tpm-crb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>emulator</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>external</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendVersion'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>2.0</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </tpm>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <redirdev supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </redirdev>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <channel supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </channel>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <crypto supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>qemu</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </crypto>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <interface supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='backendType'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>passt</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </interface>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <panic supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>isa</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>hyperv</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </panic>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <console supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>null</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vc</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dev</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>file</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pipe</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>stdio</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>udp</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tcp</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>qemu-vdagent</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </console>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </devices>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <features>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <gic supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <genid supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <backup supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <async-teardown supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <s390-pv supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <ps2 supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <tdx supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <sev supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <sgx supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <hyperv supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='features'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>relaxed</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vapic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>spinlocks</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vpindex</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>runtime</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>synic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>stimer</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>reset</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>vendor_id</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>frequencies</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>reenlightenment</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>tlbflush</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>ipi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>avic</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>emsr_bitmap</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>xmm_input</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <defaults>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </defaults>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </hyperv>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <launchSecurity supported='no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </features>
Jan 23 09:07:42 compute-0 nova_compute[181149]: </domainCapabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.823 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 09:07:42 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.826 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 09:07:42 compute-0 nova_compute[181149]: <domainCapabilities>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <domain>kvm</domain>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <arch>x86_64</arch>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <vcpu max='4096'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <iothreads supported='yes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <os supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <enum name='firmware'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>efi</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <loader supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>rom</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>pflash</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='readonly'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>yes</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='secure'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>yes</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </loader>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   </os>
Jan 23 09:07:42 compute-0 nova_compute[181149]:   <cpu>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <enum name='maximumMigratable'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <vendor>AMD</vendor>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='succor'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:42 compute-0 nova_compute[181149]:     <mode name='custom' supported='yes'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 sudo[182017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgmifyiqgumhwufjcliwwftttiobzafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159262.7766252-3575-182827827943532/AnsiballZ_systemd.py'
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:42 compute-0 sudo[182017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Denverton-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Haswell-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:42 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='athlon'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='athlon-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='core2duo'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='core2duo-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='coreduo'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='coreduo-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='n270'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='n270-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='phenom'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='phenom-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </cpu>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <memoryBacking supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <enum name='sourceType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>file</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>anonymous</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>memfd</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </memoryBacking>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <devices>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <disk supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='diskDevice'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>disk</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>cdrom</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>floppy</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>lun</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>fdc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>sata</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </disk>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <graphics supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vnc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>egl-headless</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </graphics>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <video supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='modelType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vga</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>cirrus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>none</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>bochs</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>ramfb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </video>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <hostdev supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='mode'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>subsystem</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='startupPolicy'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>mandatory</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>requisite</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>optional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='subsysType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pci</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='capsType'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='pciBackend'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </hostdev>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <rng supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>random</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>egd</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </rng>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <filesystem supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='driverType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>path</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>handle</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtiofs</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </filesystem>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <tpm supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tpm-tis</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tpm-crb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>emulator</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>external</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendVersion'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>2.0</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </tpm>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <redirdev supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </redirdev>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <channel supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </channel>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <crypto supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>qemu</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </crypto>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <interface supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>passt</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </interface>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <panic supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>isa</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>hyperv</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </panic>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <console supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>null</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dev</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>file</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pipe</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>stdio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>udp</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tcp</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>qemu-vdagent</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </console>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </devices>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <features>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <gic supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <genid supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <backup supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <async-teardown supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <s390-pv supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <ps2 supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <tdx supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <sev supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <sgx supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <hyperv supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='features'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>relaxed</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vapic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>spinlocks</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vpindex</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>runtime</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>synic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>stimer</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>reset</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vendor_id</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>frequencies</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>reenlightenment</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tlbflush</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>ipi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>avic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>emsr_bitmap</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>xmm_input</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <defaults>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </defaults>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </hyperv>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <launchSecurity supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </features>
Jan 23 09:07:43 compute-0 nova_compute[181149]: </domainCapabilities>
Jan 23 09:07:43 compute-0 nova_compute[181149]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:42.912 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 09:07:43 compute-0 nova_compute[181149]: <domainCapabilities>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <domain>kvm</domain>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <arch>x86_64</arch>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <vcpu max='240'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <iothreads supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <os supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <enum name='firmware'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <loader supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>rom</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pflash</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='readonly'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>yes</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='secure'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>no</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </loader>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </os>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <cpu>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='maximumMigratable'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>on</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>off</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <vendor>AMD</vendor>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='succor'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <mode name='custom' supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Broadwell'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ddpd-u'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sha512'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sm3'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sm4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cooperlake'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Denverton'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Denverton-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amd-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='auto-ibrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='perfmon-v2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbpb'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='stibp-always-on'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-128'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-256'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx10-512'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='prefetchiti'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Haswell'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Haswell-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Haswell-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='KnightsMill'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512er'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512pf'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fma4'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tbm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xop'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='amx-tile'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-bf16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-fp16'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bitalg'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrc'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fzrm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='la57'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='taa-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='xfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SierraForest'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ifma'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cmpccxadd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fbsdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='fsrs'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ibrs-all'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='intel-psfd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='lam'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mcdt-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='pbrsb-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='psdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='serialize'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='hle'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='rtm'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512bw'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512cd'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512dq'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512f'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='avx512vl'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='mpx'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='core-capability'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='split-lock-detect'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='cldemote'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='gfni'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdir64b'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='movdiri'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='athlon'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='athlon-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='core2duo'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='core2duo-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='coreduo'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='coreduo-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='n270'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='n270-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='ss'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='phenom'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <blockers model='phenom-v1'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnow'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <feature name='3dnowext'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </blockers>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </mode>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </cpu>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <memoryBacking supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <enum name='sourceType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>file</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>anonymous</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <value>memfd</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </memoryBacking>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <devices>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <disk supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='diskDevice'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>disk</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>cdrom</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>floppy</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>lun</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>ide</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>fdc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>sata</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </disk>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <graphics supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vnc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>egl-headless</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </graphics>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <video supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='modelType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vga</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>cirrus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>none</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>bochs</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>ramfb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </video>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <hostdev supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='mode'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>subsystem</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='startupPolicy'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>mandatory</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>requisite</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>optional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='subsysType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pci</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>scsi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='capsType'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='pciBackend'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </hostdev>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <rng supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtio-non-transitional</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>random</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>egd</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </rng>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <filesystem supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='driverType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>path</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>handle</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>virtiofs</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </filesystem>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <tpm supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tpm-tis</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tpm-crb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>emulator</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>external</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendVersion'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>2.0</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </tpm>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <redirdev supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='bus'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>usb</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </redirdev>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <channel supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </channel>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <crypto supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>qemu</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendModel'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>builtin</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </crypto>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <interface supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='backendType'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>default</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>passt</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </interface>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <panic supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='model'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>isa</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>hyperv</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </panic>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <console supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='type'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>null</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vc</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pty</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dev</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>file</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>pipe</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>stdio</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>udp</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tcp</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>unix</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>qemu-vdagent</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>dbus</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </console>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </devices>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <features>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <gic supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <genid supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <backup supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <async-teardown supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <s390-pv supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <ps2 supported='yes'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <tdx supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <sev supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <sgx supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <hyperv supported='yes'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <enum name='features'>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>relaxed</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vapic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>spinlocks</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vpindex</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>runtime</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>synic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>stimer</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>reset</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>vendor_id</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>frequencies</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>reenlightenment</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>tlbflush</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>ipi</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>avic</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>emsr_bitmap</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <value>xmm_input</value>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </enum>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       <defaults>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:43 compute-0 nova_compute[181149]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:43 compute-0 nova_compute[181149]:       </defaults>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     </hyperv>
Jan 23 09:07:43 compute-0 nova_compute[181149]:     <launchSecurity supported='no'/>
Jan 23 09:07:43 compute-0 nova_compute[181149]:   </features>
Jan 23 09:07:43 compute-0 nova_compute[181149]: </domainCapabilities>
Jan 23 09:07:43 compute-0 nova_compute[181149]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.006 181153 DEBUG nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.006 181153 INFO nova.virt.libvirt.host [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Secure Boot support detected
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.008 181153 INFO nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.008 181153 INFO nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.015 181153 DEBUG nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 23 09:07:43 compute-0 nova_compute[181149]:   <model>Nehalem</model>
Jan 23 09:07:43 compute-0 nova_compute[181149]: </cpu>
Jan 23 09:07:43 compute-0 nova_compute[181149]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.021 181153 DEBUG nova.virt.libvirt.driver [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.038 181153 INFO nova.virt.node [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Determined node identity 052a7ae7-9ec7-49ca-a013-73791f9c049a from /var/lib/nova/compute_id
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.048 181153 WARNING nova.compute.manager [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Compute nodes ['052a7ae7-9ec7-49ca-a013-73791f9c049a'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.061 181153 INFO nova.compute.manager [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.079 181153 WARNING nova.compute.manager [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.079 181153 DEBUG oslo_concurrency.lockutils [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.080 181153 DEBUG oslo_concurrency.lockutils [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.080 181153 DEBUG oslo_concurrency.lockutils [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.080 181153 DEBUG nova.compute.resource_tracker [None req-4e3f4713-fb6c-4c3f-b635-a19322aa7775 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:07:43 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 09:07:43 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 23 09:07:43 compute-0 python3.9[182019]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:07:43 compute-0 systemd[1]: Stopping nova_compute container...
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.315 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.315 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:07:43 compute-0 nova_compute[181149]: 2026-01-23 09:07:43.315 181153 DEBUG oslo_concurrency.lockutils [None req-9ac1f8bb-9596-4d44-a839-ab99d50ff6c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:07:43 compute-0 virtqemud[181713]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 09:07:43 compute-0 virtqemud[181713]: hostname: compute-0
Jan 23 09:07:43 compute-0 virtqemud[181713]: End of file while reading data: Input/output error
Jan 23 09:07:43 compute-0 systemd[1]: libpod-100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7.scope: Deactivated successfully.
Jan 23 09:07:43 compute-0 systemd[1]: libpod-100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7.scope: Consumed 2.481s CPU time.
Jan 23 09:07:43 compute-0 podman[182046]: 2026-01-23 09:07:43.58937067 +0000 UTC m=+0.306133592 container died 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 09:07:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7-userdata-shm.mount: Deactivated successfully.
Jan 23 09:07:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5-merged.mount: Deactivated successfully.
Jan 23 09:07:43 compute-0 podman[182046]: 2026-01-23 09:07:43.622570798 +0000 UTC m=+0.339333720 container cleanup 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute)
Jan 23 09:07:43 compute-0 podman[182046]: nova_compute
Jan 23 09:07:43 compute-0 podman[182070]: nova_compute
Jan 23 09:07:43 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 09:07:43 compute-0 systemd[1]: Stopped nova_compute container.
Jan 23 09:07:43 compute-0 systemd[1]: Starting nova_compute container...
Jan 23 09:07:43 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:07:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2cc4feb039b3cf2f8d07a1808ad8d4208c72f5d206f7a02038bdf30c4c0afd5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:43 compute-0 podman[182079]: 2026-01-23 09:07:43.746898489 +0000 UTC m=+0.059665554 container init 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:07:43 compute-0 podman[182079]: 2026-01-23 09:07:43.754268612 +0000 UTC m=+0.067035667 container start 100c0896c4c8ebda1cb7b9984bad246815641274ac7caf699b5a52afc7cc8ad7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:07:43 compute-0 podman[182079]: nova_compute
Jan 23 09:07:43 compute-0 nova_compute[182092]: + sudo -E kolla_set_configs
Jan 23 09:07:43 compute-0 systemd[1]: Started nova_compute container.
Jan 23 09:07:43 compute-0 sudo[182017]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Validating config file
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying service configuration files
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /etc/ceph
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Creating directory /etc/ceph
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Writing out command to execute
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:43 compute-0 nova_compute[182092]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 09:07:43 compute-0 nova_compute[182092]: ++ cat /run_command
Jan 23 09:07:43 compute-0 nova_compute[182092]: + CMD=nova-compute
Jan 23 09:07:43 compute-0 nova_compute[182092]: + ARGS=
Jan 23 09:07:43 compute-0 nova_compute[182092]: + sudo kolla_copy_cacerts
Jan 23 09:07:43 compute-0 nova_compute[182092]: + [[ ! -n '' ]]
Jan 23 09:07:43 compute-0 nova_compute[182092]: + . kolla_extend_start
Jan 23 09:07:43 compute-0 nova_compute[182092]: Running command: 'nova-compute'
Jan 23 09:07:43 compute-0 nova_compute[182092]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 09:07:43 compute-0 nova_compute[182092]: + umask 0022
Jan 23 09:07:43 compute-0 nova_compute[182092]: + exec nova-compute
Jan 23 09:07:44 compute-0 sudo[182253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfrvjpqhjgguldaixhfxdfeurffnknqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159263.9736993-3602-26811097111104/AnsiballZ_podman_container.py'
Jan 23 09:07:44 compute-0 sudo[182253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:44 compute-0 python3.9[182255]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 09:07:44 compute-0 systemd[1]: Started libpod-conmon-4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc.scope.
Jan 23 09:07:44 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:07:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaa950e06ce2a98096daecac6c2cc0bc5062110e6619e0eb3f7be3c895c589e0/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaa950e06ce2a98096daecac6c2cc0bc5062110e6619e0eb3f7be3c895c589e0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaa950e06ce2a98096daecac6c2cc0bc5062110e6619e0eb3f7be3c895c589e0/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 09:07:44 compute-0 podman[182275]: 2026-01-23 09:07:44.513299126 +0000 UTC m=+0.092312289 container init 4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:07:44 compute-0 podman[182275]: 2026-01-23 09:07:44.520267351 +0000 UTC m=+0.099280494 container start 4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:07:44 compute-0 python3.9[182255]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 09:07:44 compute-0 nova_compute_init[182295]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 09:07:44 compute-0 systemd[1]: libpod-4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc.scope: Deactivated successfully.
Jan 23 09:07:44 compute-0 podman[182307]: 2026-01-23 09:07:44.594283215 +0000 UTC m=+0.023322527 container died 4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init)
Jan 23 09:07:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc-userdata-shm.mount: Deactivated successfully.
Jan 23 09:07:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-eaa950e06ce2a98096daecac6c2cc0bc5062110e6619e0eb3f7be3c895c589e0-merged.mount: Deactivated successfully.
Jan 23 09:07:44 compute-0 podman[182307]: 2026-01-23 09:07:44.615007719 +0000 UTC m=+0.044047022 container cleanup 4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:07:44 compute-0 systemd[1]: libpod-conmon-4bc8d721c652af65e63925696a9c646adcb6318f25c4d68b64bcff8cd53092dc.scope: Deactivated successfully.
Jan 23 09:07:44 compute-0 sudo[182253]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:45 compute-0 sshd-session[159148]: Connection closed by 192.168.122.30 port 49558
Jan 23 09:07:45 compute-0 sshd-session[159145]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:07:45 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 09:07:45 compute-0 systemd[1]: session-23.scope: Consumed 1min 7.490s CPU time.
Jan 23 09:07:45 compute-0 systemd-logind[746]: Session 23 logged out. Waiting for processes to exit.
Jan 23 09:07:45 compute-0 systemd-logind[746]: Removed session 23.
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.510 182096 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.510 182096 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.510 182096 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.510 182096 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.630 182096 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.641 182096 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:07:45 compute-0 nova_compute[182092]: 2026-01-23 09:07:45.641 182096 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.050 182096 INFO nova.virt.driver [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.130 182096 INFO nova.compute.provider_config [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.137 182096 DEBUG oslo_concurrency.lockutils [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.137 182096 DEBUG oslo_concurrency.lockutils [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.137 182096 DEBUG oslo_concurrency.lockutils [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.137 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.138 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.139 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.140 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.141 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.142 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.143 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.144 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.144 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.144 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.144 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.144 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.145 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.146 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.147 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.148 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.149 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.150 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.151 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.152 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.153 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.154 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.155 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.156 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.157 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.158 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.159 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.160 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.161 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.162 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.163 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.164 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.165 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.166 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.167 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.168 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.169 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.170 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.171 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.172 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.173 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.174 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.175 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.176 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.177 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.178 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.179 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.180 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.181 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.182 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.183 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.184 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.185 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.186 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.187 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.188 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.189 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.190 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.191 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.192 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.193 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.194 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.195 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.196 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.197 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.198 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.199 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.200 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.201 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.202 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.203 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.204 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.205 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.206 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.207 182096 WARNING oslo_config.cfg [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 09:07:46 compute-0 nova_compute[182092]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 09:07:46 compute-0 nova_compute[182092]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 09:07:46 compute-0 nova_compute[182092]: and ``live_migration_inbound_addr`` respectively.
Jan 23 09:07:46 compute-0 nova_compute[182092]: ).  Its value may be silently ignored in the future.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.208 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.209 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.210 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.211 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.212 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.213 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.214 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.215 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.216 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.217 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.218 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.219 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.220 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.221 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.222 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.223 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.224 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.225 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.226 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.227 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.228 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.229 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.230 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.231 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.232 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.233 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.234 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.235 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.236 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.237 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.238 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.239 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.240 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.241 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.242 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.243 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.244 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.245 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.246 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.247 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.248 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.249 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.250 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.251 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.252 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.253 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.254 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.255 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.256 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.257 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.258 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.259 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.260 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.261 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.262 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.263 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.264 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.265 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.266 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.267 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.268 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.269 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.269 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.269 182096 DEBUG oslo_service.service [None req-20812ad2-ae0e-4979-8476-4d83d868d2ea - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.269 182096 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.278 182096 INFO nova.virt.node [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Determined node identity 052a7ae7-9ec7-49ca-a013-73791f9c049a from /var/lib/nova/compute_id
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.278 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.279 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.279 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.279 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.288 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f70b4c38a00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.290 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f70b4c38a00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.291 182096 INFO nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Connection event '1' reason 'None'
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.294 182096 INFO nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]: 
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <host>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <uuid>5a008b0a-4e78-4797-b262-5f7749cb75af</uuid>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <arch>x86_64</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model>EPYC-Milan-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <microcode version='167776725'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <signature family='25' model='1' stepping='1'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <maxphysaddr mode='emulate' bits='48'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='x2apic'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='tsc-deadline'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='osxsave'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='hypervisor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='tsc_adjust'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='ospke'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='vaes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='vpclmulqdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='spec-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='stibp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='arch-capabilities'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='cmp_legacy'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='virt-ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='lbrv'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='tsc-scale'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='vmcb-clean'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='pause-filter'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='pfthreshold'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='v-vmsave-vmload'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='vgif'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='rdctl-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='mds-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature name='pschange-mc-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <pages unit='KiB' size='4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <pages unit='KiB' size='2048'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <pages unit='KiB' size='1048576'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <power_management>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <suspend_mem/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <suspend_disk/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <suspend_hybrid/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </power_management>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <iommu support='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <migration_features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <live/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <uri_transports>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <uri_transport>tcp</uri_transport>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <uri_transport>rdma</uri_transport>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </uri_transports>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </migration_features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <topology>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <cells num='1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <cell id='0'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <memory unit='KiB'>7865364</memory>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <pages unit='KiB' size='4'>1966341</pages>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <distances>
Jan 23 09:07:46 compute-0 nova_compute[182092]:             <sibling id='0' value='10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           </distances>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           <cpus num='4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:           </cpus>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         </cell>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </cells>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </topology>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <cache>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </cache>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <secmodel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model>selinux</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <doi>0</doi>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </secmodel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <secmodel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model>dac</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <doi>0</doi>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </secmodel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </host>
Jan 23 09:07:46 compute-0 nova_compute[182092]: 
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <guest>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <os_type>hvm</os_type>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <arch name='i686'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <wordsize>32</wordsize>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <domain type='qemu'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <domain type='kvm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <pae/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <nonpae/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <acpi default='on' toggle='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <apic default='on' toggle='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <cpuselection/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <deviceboot/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <disksnapshot default='on' toggle='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <externalSnapshot/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </guest>
Jan 23 09:07:46 compute-0 nova_compute[182092]: 
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <guest>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <os_type>hvm</os_type>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <arch name='x86_64'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <wordsize>64</wordsize>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <domain type='qemu'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <domain type='kvm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <acpi default='on' toggle='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <apic default='on' toggle='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <cpuselection/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <deviceboot/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <disksnapshot default='on' toggle='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <externalSnapshot/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </guest>
Jan 23 09:07:46 compute-0 nova_compute[182092]: 
Jan 23 09:07:46 compute-0 nova_compute[182092]: </capabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]: 
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.297 182096 DEBUG nova.virt.libvirt.volume.mount [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.300 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.309 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 09:07:46 compute-0 nova_compute[182092]: <domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <domain>kvm</domain>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <arch>i686</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <vcpu max='240'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <iothreads supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <os supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='firmware'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <loader supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>rom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pflash</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='readonly'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>yes</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='secure'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </loader>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </os>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='maximumMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='succor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='custom' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <memoryBacking supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='sourceType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>anonymous</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>memfd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </memoryBacking>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <disk supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='diskDevice'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>disk</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cdrom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>floppy</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>lun</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ide</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>fdc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>sata</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <graphics supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vnc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egl-headless</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <video supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='modelType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vga</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cirrus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>none</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>bochs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ramfb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </video>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hostdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='mode'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>subsystem</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='startupPolicy'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>mandatory</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>requisite</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>optional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='subsysType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pci</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='capsType'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='pciBackend'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hostdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <rng supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>random</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <filesystem supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='driverType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>path</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>handle</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtiofs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </filesystem>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tpm supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-tis</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-crb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emulator</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>external</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendVersion'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>2.0</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </tpm>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <redirdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </redirdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <channel supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </channel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <crypto supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </crypto>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <interface supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>passt</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <panic supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>isa</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>hyperv</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </panic>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <console supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>null</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dev</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pipe</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stdio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>udp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tcp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu-vdagent</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </console>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <gic supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <genid supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backup supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <async-teardown supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <s390-pv supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <ps2 supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tdx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sev supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sgx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hyperv supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='features'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>relaxed</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vapic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>spinlocks</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vpindex</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>runtime</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>synic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stimer</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reset</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vendor_id</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>frequencies</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reenlightenment</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tlbflush</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ipi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>avic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emsr_bitmap</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>xmm_input</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hyperv>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <launchSecurity supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.313 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 09:07:46 compute-0 nova_compute[182092]: <domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <domain>kvm</domain>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <arch>i686</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <vcpu max='4096'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <iothreads supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <os supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='firmware'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <loader supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>rom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pflash</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='readonly'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>yes</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='secure'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </loader>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </os>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='maximumMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='succor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='custom' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <memoryBacking supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='sourceType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>anonymous</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>memfd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </memoryBacking>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <disk supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='diskDevice'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>disk</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cdrom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>floppy</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>lun</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>fdc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>sata</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <graphics supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vnc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egl-headless</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <video supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='modelType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vga</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cirrus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>none</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>bochs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ramfb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </video>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hostdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='mode'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>subsystem</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='startupPolicy'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>mandatory</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>requisite</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>optional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='subsysType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pci</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='capsType'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='pciBackend'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hostdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <rng supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>random</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <filesystem supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='driverType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>path</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>handle</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtiofs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </filesystem>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tpm supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-tis</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-crb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emulator</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>external</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendVersion'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>2.0</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </tpm>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <redirdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </redirdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <channel supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </channel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <crypto supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </crypto>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <interface supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>passt</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <panic supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>isa</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>hyperv</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </panic>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <console supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>null</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dev</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pipe</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stdio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>udp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tcp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu-vdagent</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </console>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <gic supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <genid supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backup supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <async-teardown supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <s390-pv supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <ps2 supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tdx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sev supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sgx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hyperv supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='features'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>relaxed</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vapic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>spinlocks</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vpindex</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>runtime</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>synic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stimer</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reset</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vendor_id</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>frequencies</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reenlightenment</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tlbflush</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ipi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>avic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emsr_bitmap</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>xmm_input</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hyperv>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <launchSecurity supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.336 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.339 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 09:07:46 compute-0 nova_compute[182092]: <domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <domain>kvm</domain>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <arch>x86_64</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <vcpu max='240'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <iothreads supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <os supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='firmware'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <loader supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>rom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pflash</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='readonly'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>yes</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='secure'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </loader>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </os>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='maximumMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='succor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='custom' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <memoryBacking supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='sourceType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>anonymous</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>memfd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </memoryBacking>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <disk supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='diskDevice'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>disk</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cdrom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>floppy</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>lun</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ide</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>fdc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>sata</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <graphics supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vnc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egl-headless</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <video supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='modelType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vga</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cirrus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>none</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>bochs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ramfb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </video>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hostdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='mode'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>subsystem</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='startupPolicy'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>mandatory</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>requisite</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>optional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='subsysType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pci</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='capsType'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='pciBackend'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hostdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <rng supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>random</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <filesystem supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='driverType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>path</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>handle</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtiofs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </filesystem>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tpm supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-tis</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-crb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emulator</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>external</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendVersion'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>2.0</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </tpm>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <redirdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </redirdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <channel supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </channel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <crypto supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </crypto>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <interface supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>passt</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <panic supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>isa</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>hyperv</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </panic>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <console supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>null</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dev</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pipe</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stdio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>udp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tcp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu-vdagent</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </console>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <gic supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <genid supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backup supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <async-teardown supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <s390-pv supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <ps2 supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tdx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sev supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sgx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hyperv supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='features'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>relaxed</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vapic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>spinlocks</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vpindex</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>runtime</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>synic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stimer</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reset</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vendor_id</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>frequencies</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reenlightenment</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tlbflush</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ipi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>avic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emsr_bitmap</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>xmm_input</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hyperv>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <launchSecurity supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.393 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 09:07:46 compute-0 nova_compute[182092]: <domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <domain>kvm</domain>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <arch>x86_64</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <vcpu max='4096'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <iothreads supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <os supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='firmware'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>efi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <loader supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>rom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pflash</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='readonly'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>yes</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='secure'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>yes</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>no</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </loader>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </os>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-passthrough' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='hostPassthroughMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='maximum' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='maximumMigratable'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>on</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>off</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='host-model' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='x2apic'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='hypervisor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vaes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vpclmulqdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='stibp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='overflow-recov'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='succor'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lbrv'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='tsc-scale'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='flushbyasid'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pause-filter'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='pfthreshold'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='vgif'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <mode name='custom' supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Broadwell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='ClearwaterForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ddpd-u'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sha512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm3'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sm4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Cooperlake-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Denverton-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Milan-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='EPYC-Turin-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amd-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='auto-ibrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vp2intersect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fs-gs-base-ns'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibpb-brtype'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='no-nested-data-bp'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='null-sel-clr-base'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='perfmon-v2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbpb'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='srso-user-kernel-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='stibp-always-on'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='AMD'>EPYC-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='GraniteRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-128'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-256'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx10-512'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='prefetchiti'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Haswell-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v6'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Icelake-Server-v7'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='KnightsMill-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4fmaps'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-4vnniw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512er'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512pf'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G4-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Opteron_G5-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fma4'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tbm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xop'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SapphireRapids-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='amx-tile'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-bf16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-fp16'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512-vpopcntdq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bitalg'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vbmi2'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrc'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fzrm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='la57'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='taa-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='tsx-ldtrk'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='xfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='SierraForest-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ifma'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-ne-convert'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx-vnni-int8'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bhi-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='bus-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cmpccxadd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fbsdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='fsrs'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ibrs-all'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='intel-psfd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ipred-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='lam'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mcdt-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='pbrsb-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='psdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rrsba-ctrl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='sbdr-ssdp-no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='serialize'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Client-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='hle'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='rtm'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Skylake-Server-v5'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512bw'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512cd'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512dq'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512f'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='avx512vl'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='mpx'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v2'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v3'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='core-capability'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='split-lock-detect'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='Snowridge-v4'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='cldemote'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='gfni'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdir64b'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='movdiri'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='athlon-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='core2duo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='coreduo-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='n270-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='ss'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <blockers model='phenom-v1'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnow'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <feature name='3dnowext'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </blockers>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </mode>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <memoryBacking supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <enum name='sourceType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>anonymous</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <value>memfd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </memoryBacking>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <disk supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='diskDevice'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>disk</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cdrom</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>floppy</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>lun</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>fdc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>sata</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <graphics supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vnc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egl-headless</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <video supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='modelType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vga</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>cirrus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>none</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>bochs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ramfb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </video>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hostdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='mode'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>subsystem</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='startupPolicy'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>mandatory</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>requisite</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>optional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='subsysType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pci</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>scsi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='capsType'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='pciBackend'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hostdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <rng supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtio-non-transitional</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>random</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>egd</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <filesystem supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='driverType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>path</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>handle</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>virtiofs</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </filesystem>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tpm supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-tis</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tpm-crb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emulator</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>external</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendVersion'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>2.0</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </tpm>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <redirdev supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='bus'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>usb</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </redirdev>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <channel supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </channel>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <crypto supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendModel'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>builtin</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </crypto>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <interface supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='backendType'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>default</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>passt</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <panic supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='model'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>isa</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>hyperv</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </panic>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <console supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='type'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>null</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vc</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pty</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dev</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>file</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>pipe</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stdio</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>udp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tcp</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>unix</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>qemu-vdagent</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>dbus</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </console>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <features>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <gic supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <vmcoreinfo supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <genid supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backingStoreInput supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <backup supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <async-teardown supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <s390-pv supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <ps2 supported='yes'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <tdx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sev supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <sgx supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <hyperv supported='yes'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <enum name='features'>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>relaxed</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vapic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>spinlocks</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vpindex</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>runtime</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>synic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>stimer</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reset</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>vendor_id</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>frequencies</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>reenlightenment</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>tlbflush</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>ipi</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>avic</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>emsr_bitmap</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <value>xmm_input</value>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </enum>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       <defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <spinlocks>4095</spinlocks>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <stimer_direct>on</stimer_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 09:07:46 compute-0 nova_compute[182092]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 09:07:46 compute-0 nova_compute[182092]:       </defaults>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     </hyperv>
Jan 23 09:07:46 compute-0 nova_compute[182092]:     <launchSecurity supported='no'/>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   </features>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </domainCapabilities>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.443 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.443 182096 INFO nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Secure Boot support detected
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.445 182096 INFO nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.451 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <model>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.453 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.469 182096 INFO nova.virt.node [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Determined node identity 052a7ae7-9ec7-49ca-a013-73791f9c049a from /var/lib/nova/compute_id
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.486 182096 WARNING nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Compute nodes ['052a7ae7-9ec7-49ca-a013-73791f9c049a'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.504 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.512 182096 WARNING nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.512 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.512 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.513 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.513 182096 DEBUG nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.710 182096 WARNING nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.711 182096 DEBUG nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6126MB free_disk=73.58447647094727GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.711 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.711 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.725 182096 WARNING nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] No compute node record for compute-0.ctlplane.example.com:052a7ae7-9ec7-49ca-a013-73791f9c049a: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 052a7ae7-9ec7-49ca-a013-73791f9c049a could not be found.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.738 182096 INFO nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 052a7ae7-9ec7-49ca-a013-73791f9c049a
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.789 182096 DEBUG nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.789 182096 DEBUG nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.892 182096 INFO nova.scheduler.client.report [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [req-cd7e4062-025c-499e-8878-f578e1967123] Created resource provider record via placement API for resource provider with UUID 052a7ae7-9ec7-49ca-a013-73791f9c049a and name compute-0.ctlplane.example.com.
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.934 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 09:07:46 compute-0 nova_compute[182092]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.935 182096 INFO nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] kernel doesn't support AMD SEV
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.935 182096 DEBUG nova.compute.provider_tree [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.935 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.937 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Libvirt baseline CPU <cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <arch>x86_64</arch>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <model>Nehalem</model>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <vendor>AMD</vendor>
Jan 23 09:07:46 compute-0 nova_compute[182092]:   <topology sockets="4" cores="1" threads="1"/>
Jan 23 09:07:46 compute-0 nova_compute[182092]: </cpu>
Jan 23 09:07:46 compute-0 nova_compute[182092]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.966 182096 DEBUG nova.scheduler.client.report [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Updated inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.966 182096 DEBUG nova.compute.provider_tree [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Updating resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 09:07:46 compute-0 nova_compute[182092]: 2026-01-23 09:07:46.966 182096 DEBUG nova.compute.provider_tree [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.018 182096 DEBUG nova.compute.provider_tree [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Updating resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.036 182096 DEBUG nova.compute.resource_tracker [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.036 182096 DEBUG oslo_concurrency.lockutils [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.036 182096 DEBUG nova.service [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.072 182096 DEBUG nova.service [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 23 09:07:47 compute-0 nova_compute[182092]: 2026-01-23 09:07:47.072 182096 DEBUG nova.servicegroup.drivers.db [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 23 09:07:49 compute-0 sshd-session[182375]: Accepted publickey for zuul from 192.168.122.30 port 33074 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:07:49 compute-0 systemd-logind[746]: New session 25 of user zuul.
Jan 23 09:07:49 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 23 09:07:49 compute-0 sshd-session[182375]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:07:50 compute-0 python3.9[182528]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:07:51 compute-0 sudo[182682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huzpvffazjotcvrqcqabypydmwbdyqea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159271.2537544-68-29255621893768/AnsiballZ_systemd_service.py'
Jan 23 09:07:51 compute-0 sudo[182682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:51 compute-0 python3.9[182684]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:07:51 compute-0 systemd[1]: Reloading.
Jan 23 09:07:52 compute-0 systemd-sysv-generator[182708]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:07:52 compute-0 systemd-rc-local-generator[182705]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:07:52 compute-0 sudo[182682]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:52 compute-0 python3.9[182869]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:07:52 compute-0 network[182886]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:07:52 compute-0 network[182887]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:07:52 compute-0 network[182888]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:07:53 compute-0 podman[182895]: 2026-01-23 09:07:53.667672002 +0000 UTC m=+0.104750218 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 09:07:55 compute-0 sudo[183181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqcicpsztrmxbezreysaidbacuzooeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159275.2200637-125-32376772586379/AnsiballZ_systemd_service.py'
Jan 23 09:07:55 compute-0 sudo[183181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:55 compute-0 python3.9[183183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:07:55 compute-0 sudo[183181]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:56 compute-0 sudo[183334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwfctyvjpvmoosrylbqkjzwlwbllfviz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159276.0008354-155-135481817340356/AnsiballZ_file.py'
Jan 23 09:07:56 compute-0 sudo[183334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:56 compute-0 python3.9[183336]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:56 compute-0 sudo[183334]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:56 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:07:56 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:07:56 compute-0 sudo[183487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogeqzhuhwrhqfgrwvdwmpgoszbgbtvnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159276.611657-179-72913885548447/AnsiballZ_file.py'
Jan 23 09:07:56 compute-0 sudo[183487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:56 compute-0 python3.9[183489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:07:56 compute-0 sudo[183487]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:57 compute-0 sudo[183639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofrddhyamuyrmohvsptcixcaizlrvtvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159277.2090929-206-271259360769967/AnsiballZ_command.py'
Jan 23 09:07:57 compute-0 sudo[183639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:57 compute-0 python3.9[183641]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:57 compute-0 sudo[183639]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:58 compute-0 python3.9[183793]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:07:58 compute-0 sudo[183943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esqexqylutbhzullxgswoneyfjtgtouj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159278.5247774-260-806673157031/AnsiballZ_systemd_service.py'
Jan 23 09:07:58 compute-0 sudo[183943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:58 compute-0 python3.9[183945]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:07:58 compute-0 systemd[1]: Reloading.
Jan 23 09:07:59 compute-0 systemd-rc-local-generator[183965]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:07:59 compute-0 systemd-sysv-generator[183969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:07:59 compute-0 sudo[183943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:07:59 compute-0 sudo[184129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzwigotmehkvvoepntrwlowrmmlgdxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159279.3687544-284-28337516345579/AnsiballZ_command.py'
Jan 23 09:07:59 compute-0 sudo[184129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:07:59 compute-0 python3.9[184131]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:07:59 compute-0 sudo[184129]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:00 compute-0 sudo[184282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzlsfqdbcdsrjfqwouucdvvmlwkmvhqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159280.0555139-311-259204783562770/AnsiballZ_file.py'
Jan 23 09:08:00 compute-0 sudo[184282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:00 compute-0 python3.9[184284]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:00 compute-0 sudo[184282]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:01 compute-0 python3.9[184434]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:01 compute-0 sudo[184586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovaflznixywwidrtwbepkianqdiftxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159281.357951-359-53480858209640/AnsiballZ_group.py'
Jan 23 09:08:01 compute-0 sudo[184586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:01 compute-0 python3.9[184588]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 23 09:08:01 compute-0 sudo[184586]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:02 compute-0 sudo[184738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlzhqsnwhpkpfgmyymodfvfqrquyjakh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159282.480994-392-264291597833616/AnsiballZ_getent.py'
Jan 23 09:08:02 compute-0 sudo[184738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:03 compute-0 python3.9[184740]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 23 09:08:03 compute-0 sudo[184738]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:03 compute-0 sudo[184891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oulapfekjmhinbgcqmcoggiybuiupxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159283.304783-416-72677094039018/AnsiballZ_group.py'
Jan 23 09:08:03 compute-0 sudo[184891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:03 compute-0 python3.9[184893]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:08:03 compute-0 groupadd[184894]: group added to /etc/group: name=ceilometer, GID=42405
Jan 23 09:08:03 compute-0 groupadd[184894]: group added to /etc/gshadow: name=ceilometer
Jan 23 09:08:03 compute-0 groupadd[184894]: new group: name=ceilometer, GID=42405
Jan 23 09:08:03 compute-0 sudo[184891]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:04 compute-0 sudo[185049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iunazoavyhgeuipirgnatjrasuvuympa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159284.1705146-440-218571375748958/AnsiballZ_user.py'
Jan 23 09:08:04 compute-0 sudo[185049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:04 compute-0 python3.9[185051]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:08:04 compute-0 useradd[185053]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:08:04 compute-0 useradd[185053]: add 'ceilometer' to group 'libvirt'
Jan 23 09:08:04 compute-0 useradd[185053]: add 'ceilometer' to shadow group 'libvirt'
Jan 23 09:08:04 compute-0 sudo[185049]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:06 compute-0 python3.9[185209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:06 compute-0 python3.9[185330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769159285.6410544-518-251584300114703/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:07 compute-0 python3.9[185480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:07 compute-0 python3.9[185601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769159286.740142-518-89643400665210/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:07 compute-0 python3.9[185751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:08 compute-0 python3.9[185872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769159287.597763-518-101597980151869/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:08 compute-0 python3.9[186022]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:09 compute-0 python3.9[186174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:09 compute-0 python3.9[186326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:10 compute-0 nova_compute[182092]: 2026-01-23 09:08:10.076 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:10 compute-0 nova_compute[182092]: 2026-01-23 09:08:10.107 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:10 compute-0 python3.9[186447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159289.5674894-695-252397601569157/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:10 compute-0 python3.9[186597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:11 compute-0 python3.9[186718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159290.500062-695-106433552087169/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:11 compute-0 python3.9[186868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:12 compute-0 python3.9[186989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159291.3758945-782-96611295327793/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:12 compute-0 python3.9[187139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:13 compute-0 podman[187234]: 2026-01-23 09:08:13.025185029 +0000 UTC m=+0.037551121 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:08:13 compute-0 python3.9[187272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159292.4288285-830-218249896025576/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:13 compute-0 python3.9[187426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:14 compute-0 python3.9[187547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159293.3175921-875-175157407749215/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:14 compute-0 python3.9[187697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:14 compute-0 python3.9[187818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159294.2275848-920-14305137439232/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:15 compute-0 sudo[187968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jreegnltjlosldghitvvzyenhadspigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159295.1295824-965-183936376561172/AnsiballZ_file.py'
Jan 23 09:08:15 compute-0 sudo[187968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:15 compute-0 python3.9[187970]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:15 compute-0 sudo[187968]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:15 compute-0 sudo[188120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxrcxpxtjsrmmeudgbnsersicrrmcmfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159295.6803365-989-240056885663099/AnsiballZ_file.py'
Jan 23 09:08:15 compute-0 sudo[188120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:16 compute-0 python3.9[188122]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:16 compute-0 sudo[188120]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:16 compute-0 python3.9[188272]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:17 compute-0 python3.9[188424]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:17 compute-0 python3.9[188576]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:17 compute-0 sudo[188728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywcrqbkinvgknjdbogwztclrjbckypos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159297.7639878-1085-43873408523276/AnsiballZ_file.py'
Jan 23 09:08:17 compute-0 sudo[188728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:18 compute-0 python3.9[188730]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:18 compute-0 sudo[188728]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:18 compute-0 sudo[188880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wopstcelguhikqcduyglqgunnzxrhzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159298.3143635-1109-21166523741214/AnsiballZ_systemd_service.py'
Jan 23 09:08:18 compute-0 sudo[188880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:18 compute-0 python3.9[188882]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:08:18 compute-0 systemd[1]: Reloading.
Jan 23 09:08:18 compute-0 systemd-rc-local-generator[188907]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:08:18 compute-0 systemd-sysv-generator[188910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:08:19 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 23 09:08:19 compute-0 sudo[188880]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:19 compute-0 sudo[189071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upbmmlwvkqmgwcymgazjcqhpcknvucds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/AnsiballZ_stat.py'
Jan 23 09:08:19 compute-0 sudo[189071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:19 compute-0 python3.9[189073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:19 compute-0 sudo[189071]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:19 compute-0 sudo[189194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexdqnzfnqekzmusbvcsdogngldtpdyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/AnsiballZ_copy.py'
Jan 23 09:08:20 compute-0 sudo[189194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:20 compute-0 python3.9[189196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:20 compute-0 sudo[189194]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:20 compute-0 sudo[189270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbyeuaumpcrmsvngmbyxjaqrootvncvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/AnsiballZ_stat.py'
Jan 23 09:08:20 compute-0 sudo[189270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:20 compute-0 python3.9[189272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:20 compute-0 sudo[189270]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:20 compute-0 sudo[189393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzroqicsynhnblfetnlfpxhwdsigzkts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/AnsiballZ_copy.py'
Jan 23 09:08:20 compute-0 sudo[189393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:20 compute-0 python3.9[189395]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159299.3423345-1136-96671800855525/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:20 compute-0 sudo[189393]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:21 compute-0 sudo[189545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmkscsayxtfzqqpoymvqpnfvvmrndaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159301.6461287-1232-14674707174438/AnsiballZ_file.py'
Jan 23 09:08:21 compute-0 sudo[189545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:22 compute-0 python3.9[189547]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:22 compute-0 sudo[189545]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:22 compute-0 sudo[189697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obseydhkompobmqxntjlulbfujkhbgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159302.2179964-1256-258176912583595/AnsiballZ_file.py'
Jan 23 09:08:22 compute-0 sudo[189697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:22 compute-0 python3.9[189699]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:22 compute-0 sudo[189697]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:22 compute-0 sudo[189849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnjqlyuurzutmhpuomeofybcyqmebxnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159302.760955-1280-239493309960244/AnsiballZ_stat.py'
Jan 23 09:08:22 compute-0 sudo[189849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:23 compute-0 python3.9[189851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:23 compute-0 sudo[189849]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:23 compute-0 sudo[189972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qesrsfltwhxefxemupynzxqmyevlpeoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159302.760955-1280-239493309960244/AnsiballZ_copy.py'
Jan 23 09:08:23 compute-0 sudo[189972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:23 compute-0 python3.9[189974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159302.760955-1280-239493309960244/.source.json _original_basename=.ku6bev_s follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:23 compute-0 sudo[189972]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:24 compute-0 podman[190098]: 2026-01-23 09:08:24.014244761 +0000 UTC m=+0.071059352 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 09:08:24 compute-0 python3.9[190133]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:25 compute-0 sudo[190568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycytxasjpcmmoxbhtsrqtywavolzpajn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159305.4028833-1400-127036435253236/AnsiballZ_container_config_data.py'
Jan 23 09:08:25 compute-0 sudo[190568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:25 compute-0 python3.9[190570]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 23 09:08:25 compute-0 sudo[190568]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:26 compute-0 sudo[190720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgtwjhgfvfcagzremkvflpyfalejfzic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159306.293706-1433-241536404382895/AnsiballZ_container_config_hash.py'
Jan 23 09:08:26 compute-0 sudo[190720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:26 compute-0 python3.9[190722]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:08:26 compute-0 sudo[190720]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:27 compute-0 sudo[190872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfamifbxgyayspycahdtgvbcehczvoat ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159307.131038-1463-239139326571654/AnsiballZ_edpm_container_manage.py'
Jan 23 09:08:27 compute-0 sudo[190872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:27 compute-0 python3[190874]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:08:27 compute-0 podman[190903]: 2026-01-23 09:08:27.839801216 +0000 UTC m=+0.029943888 container create 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:08:27 compute-0 podman[190903]: 2026-01-23 09:08:27.826018772 +0000 UTC m=+0.016161464 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 23 09:08:27 compute-0 python3[190874]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 23 09:08:27 compute-0 sudo[190872]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:28 compute-0 sudo[191079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izeqrsbppccicmwmofizjocnhamvjxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159308.1119146-1487-160387976755134/AnsiballZ_stat.py'
Jan 23 09:08:28 compute-0 sudo[191079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:28 compute-0 python3.9[191081]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:28 compute-0 sudo[191079]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:29 compute-0 sudo[191233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfooffmgvqtpiocyuoztydobmzftsws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159308.8094566-1514-210392249426088/AnsiballZ_file.py'
Jan 23 09:08:29 compute-0 sudo[191233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:29 compute-0 python3.9[191235]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:29 compute-0 sudo[191233]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:29 compute-0 sudo[191309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txbozbeskszhqixqibffpgdyjpncnpon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159308.8094566-1514-210392249426088/AnsiballZ_stat.py'
Jan 23 09:08:29 compute-0 sudo[191309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:29 compute-0 python3.9[191311]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:29 compute-0 sudo[191309]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:29 compute-0 sudo[191460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huhehewvuppsvwcthnxkdjshmzkuujxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159309.5576906-1514-125548992607559/AnsiballZ_copy.py'
Jan 23 09:08:29 compute-0 sudo[191460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:30 compute-0 python3.9[191462]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159309.5576906-1514-125548992607559/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:30 compute-0 sudo[191460]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:30 compute-0 sudo[191536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoozjyzcpziepempaoqgllujxbsfrdas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159309.5576906-1514-125548992607559/AnsiballZ_systemd.py'
Jan 23 09:08:30 compute-0 sudo[191536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:30 compute-0 python3.9[191538]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:08:30 compute-0 systemd[1]: Reloading.
Jan 23 09:08:30 compute-0 systemd-rc-local-generator[191557]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:08:30 compute-0 systemd-sysv-generator[191564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:08:31 compute-0 sudo[191536]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:31 compute-0 sudo[191647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjmmzakyswtnzbhlsvgbzwybvunlrhxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159309.5576906-1514-125548992607559/AnsiballZ_systemd.py'
Jan 23 09:08:31 compute-0 sudo[191647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:31 compute-0 python3.9[191649]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:08:31 compute-0 systemd[1]: Reloading.
Jan 23 09:08:31 compute-0 systemd-rc-local-generator[191672]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:08:31 compute-0 systemd-sysv-generator[191678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:08:31 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 23 09:08:31 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:08:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/747390e4701f2608bc4947afa96df0043cbcdb820643c6b36568b2eb1a86c1f0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/747390e4701f2608bc4947afa96df0043cbcdb820643c6b36568b2eb1a86c1f0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/747390e4701f2608bc4947afa96df0043cbcdb820643c6b36568b2eb1a86c1f0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/747390e4701f2608bc4947afa96df0043cbcdb820643c6b36568b2eb1a86c1f0/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:31 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc.
Jan 23 09:08:31 compute-0 podman[191689]: 2026-01-23 09:08:31.936286553 +0000 UTC m=+0.088214809 container init 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: + sudo -E kolla_set_configs
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: sudo: unable to send audit message: Operation not permitted
Jan 23 09:08:31 compute-0 sudo[191708]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 23 09:08:31 compute-0 sudo[191708]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 23 09:08:31 compute-0 sudo[191708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 23 09:08:31 compute-0 podman[191689]: 2026-01-23 09:08:31.967624881 +0000 UTC m=+0.119553117 container start 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:08:31 compute-0 podman[191689]: ceilometer_agent_compute
Jan 23 09:08:31 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Validating config file
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Copying service configuration files
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 23 09:08:31 compute-0 ceilometer_agent_compute[191702]: INFO:__main__:Writing out command to execute
Jan 23 09:08:32 compute-0 sudo[191647]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:32 compute-0 sudo[191708]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: ++ cat /run_command
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + ARGS=
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + sudo kolla_copy_cacerts
Jan 23 09:08:32 compute-0 sudo[191722]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: sudo: unable to send audit message: Operation not permitted
Jan 23 09:08:32 compute-0 sudo[191722]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 23 09:08:32 compute-0 sudo[191722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 23 09:08:32 compute-0 sudo[191722]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + [[ ! -n '' ]]
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + . kolla_extend_start
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + umask 0022
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 23 09:08:32 compute-0 podman[191709]: 2026-01-23 09:08:32.052140269 +0000 UTC m=+0.083790662 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:08:32 compute-0 systemd[1]: 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc-2e6b171b25f52548.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 09:08:32 compute-0 systemd[1]: 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc-2e6b171b25f52548.service: Failed with result 'exit-code'.
Jan 23 09:08:32 compute-0 python3.9[191880]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.796 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.797 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.798 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.799 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.800 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.801 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.804 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.805 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.806 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.811 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.811 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.811 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.811 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.828 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.829 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.830 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.899 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.968 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.969 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.970 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.971 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.972 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.973 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.978 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.987 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.989 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.995 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:08:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:08:33 compute-0 sudo[192036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evsiwxlwwvrhihnzqszaxzsvmklfysms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159313.462027-1649-196019740773164/AnsiballZ_stat.py'
Jan 23 09:08:33 compute-0 sudo[192036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:33 compute-0 python3.9[192038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:33 compute-0 sudo[192036]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:34 compute-0 sudo[192161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbmzqotnphqnkkiabwprmnfhtqzwqfac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159313.462027-1649-196019740773164/AnsiballZ_copy.py'
Jan 23 09:08:34 compute-0 sudo[192161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:34 compute-0 python3.9[192163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159313.462027-1649-196019740773164/.source.yaml _original_basename=.ncvo4_fj follow=False checksum=6239a88e873faec6fad31a1dd0ac2738ee6c3cf3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:34 compute-0 sudo[192161]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:34 compute-0 sudo[192313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klhenfzoemkqnaqyhipuzzpsxpmhgxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159314.518516-1694-177742842220694/AnsiballZ_stat.py'
Jan 23 09:08:34 compute-0 sudo[192313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:34 compute-0 python3.9[192315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:34 compute-0 sudo[192313]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:35 compute-0 sudo[192436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnyevhzxuplguzxrlutltdsbiyvpgxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159314.518516-1694-177742842220694/AnsiballZ_copy.py'
Jan 23 09:08:35 compute-0 sudo[192436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:35 compute-0 python3.9[192438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159314.518516-1694-177742842220694/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:35 compute-0 sudo[192436]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:36 compute-0 sudo[192588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmshxvdoxuffljkzcybxsluvaaciqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159315.9851367-1757-205701856032537/AnsiballZ_file.py'
Jan 23 09:08:36 compute-0 sudo[192588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:36 compute-0 python3.9[192590]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:36 compute-0 sudo[192588]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:36 compute-0 sudo[192740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyflkteovgaknukdwyomxvlaildqhun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159316.5529702-1781-75793122621474/AnsiballZ_file.py'
Jan 23 09:08:36 compute-0 sudo[192740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:36 compute-0 python3.9[192742]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:36 compute-0 sudo[192740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:37 compute-0 sudo[192892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yerjuwmsictdqkseupxucgjwjzfirzzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159317.142052-1805-113039475733137/AnsiballZ_stat.py'
Jan 23 09:08:37 compute-0 sudo[192892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:37 compute-0 python3.9[192894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:37 compute-0 sudo[192892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:37 compute-0 sudo[192970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexlpeylcsmlqbvthkbdyfsmyfiskswt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159317.142052-1805-113039475733137/AnsiballZ_file.py'
Jan 23 09:08:37 compute-0 sudo[192970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:37 compute-0 python3.9[192972]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.z55qseu7 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:37 compute-0 sudo[192970]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:38 compute-0 python3.9[193122]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:08:39.846 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:08:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:08:39.847 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:08:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:08:39.847 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:08:40 compute-0 sudo[193543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koevsjimypbonelroqkiozqopyaijqyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159319.859857-1916-279647242043030/AnsiballZ_container_config_data.py'
Jan 23 09:08:40 compute-0 sudo[193543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:40 compute-0 python3.9[193545]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 23 09:08:40 compute-0 sudo[193543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:40 compute-0 sudo[193695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usivjgfqwgepiupbfxudjzfbutooupxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159320.6576104-1949-213093826282559/AnsiballZ_container_config_hash.py'
Jan 23 09:08:40 compute-0 sudo[193695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:41 compute-0 python3.9[193697]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:08:41 compute-0 sudo[193695]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:41 compute-0 sudo[193847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agniilgsrlyovqkyybqlpvlmnkhyogsn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159321.3982058-1979-201534896265461/AnsiballZ_edpm_container_manage.py'
Jan 23 09:08:41 compute-0 sudo[193847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:41 compute-0 python3[193849]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:08:41 compute-0 podman[193877]: 2026-01-23 09:08:41.943703709 +0000 UTC m=+0.029107439 container create d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:08:41 compute-0 podman[193877]: 2026-01-23 09:08:41.930903639 +0000 UTC m=+0.016307388 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 23 09:08:41 compute-0 python3[193849]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 23 09:08:42 compute-0 sudo[193847]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:42 compute-0 sudo[194053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbpyxhhvhiteclzfjzhteccuhnzfcczk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159322.2525616-2003-27539809507486/AnsiballZ_stat.py'
Jan 23 09:08:42 compute-0 sudo[194053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:42 compute-0 python3.9[194055]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:42 compute-0 sudo[194053]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:43 compute-0 sudo[194217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmwtppqddccesejcfswxtvrcyjnpdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159322.8723097-2030-135518357029597/AnsiballZ_file.py'
Jan 23 09:08:43 compute-0 sudo[194217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:43 compute-0 podman[194181]: 2026-01-23 09:08:43.112210569 +0000 UTC m=+0.045773524 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:08:43 compute-0 python3.9[194226]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:43 compute-0 sudo[194217]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:43 compute-0 sudo[194300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqfjdksxkdaojanllfpiogysxdukevan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159322.8723097-2030-135518357029597/AnsiballZ_stat.py'
Jan 23 09:08:43 compute-0 sudo[194300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:43 compute-0 python3.9[194302]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:43 compute-0 sudo[194300]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:44 compute-0 sudo[194451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzdnvkawqailpvkjnsogerxoippjmmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159323.7024171-2030-123147323882608/AnsiballZ_copy.py'
Jan 23 09:08:44 compute-0 sudo[194451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:44 compute-0 python3.9[194453]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159323.7024171-2030-123147323882608/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:44 compute-0 sudo[194451]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:44 compute-0 sudo[194527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nublhyvscvwdvpfqyvwbqcxvzscduwga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159323.7024171-2030-123147323882608/AnsiballZ_systemd.py'
Jan 23 09:08:44 compute-0 sudo[194527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:44 compute-0 python3.9[194529]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:08:44 compute-0 systemd[1]: Reloading.
Jan 23 09:08:44 compute-0 systemd-sysv-generator[194552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:08:44 compute-0 systemd-rc-local-generator[194549]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:08:44 compute-0 sudo[194527]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:45 compute-0 sudo[194637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiojvtvuxyaxutndhozurkpmyzavhxph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159323.7024171-2030-123147323882608/AnsiballZ_systemd.py'
Jan 23 09:08:45 compute-0 sudo[194637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:45 compute-0 python3.9[194639]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:08:45 compute-0 systemd[1]: Reloading.
Jan 23 09:08:45 compute-0 systemd-sysv-generator[194665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:08:45 compute-0 systemd-rc-local-generator[194662]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.651 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.653 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.653 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.653 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:08:45 compute-0 systemd[1]: Starting node_exporter container...
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.665 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.666 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.666 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.666 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.666 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.683 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.684 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.684 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.684 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:08:45 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:08:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09dbcf2a3cb7a5cda257a750a177775d1da1da4c064f26920f3301490f3d3d42/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09dbcf2a3cb7a5cda257a750a177775d1da1da4c064f26920f3301490f3d3d42/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 23 09:08:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd.
Jan 23 09:08:45 compute-0 podman[194679]: 2026-01-23 09:08:45.771238357 +0000 UTC m=+0.098883038 container init d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.782Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.782Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.782Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.782Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.783Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.783Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.783Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=systemd
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.784Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.786Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 23 09:08:45 compute-0 node_exporter[194691]: ts=2026-01-23T09:08:45.786Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 23 09:08:45 compute-0 podman[194679]: 2026-01-23 09:08:45.789518687 +0000 UTC m=+0.117163349 container start d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:08:45 compute-0 podman[194679]: node_exporter
Jan 23 09:08:45 compute-0 systemd[1]: Started node_exporter container.
Jan 23 09:08:45 compute-0 sudo[194637]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:45 compute-0 podman[194700]: 2026-01-23 09:08:45.849249754 +0000 UTC m=+0.049870603 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.960 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.961 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5975MB free_disk=73.58437728881836GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.961 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:08:45 compute-0 nova_compute[182092]: 2026-01-23 09:08:45.962 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.047 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.047 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.080 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.093 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.094 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:08:46 compute-0 nova_compute[182092]: 2026-01-23 09:08:46.094 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:08:46 compute-0 python3.9[194870]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:08:47 compute-0 sudo[195020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enwqpgwxrmxexolspjzslttyzqaikavb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159327.3012724-2165-51186531361048/AnsiballZ_stat.py'
Jan 23 09:08:47 compute-0 sudo[195020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:47 compute-0 python3.9[195022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:47 compute-0 sudo[195020]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:48 compute-0 sudo[195145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwiczmvacbrtklqzmgtxgoofeykvgrib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159327.3012724-2165-51186531361048/AnsiballZ_copy.py'
Jan 23 09:08:48 compute-0 sudo[195145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:48 compute-0 python3.9[195147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159327.3012724-2165-51186531361048/.source.yaml _original_basename=.gy8tisn1 follow=False checksum=930dfa97719b238b6de00b8c2ef32b3283f74b2c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:48 compute-0 sudo[195145]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:48 compute-0 sudo[195297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axaxkdfcjtderlwlgwnjdrnyfwnnzchm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159328.4902976-2210-9602745079019/AnsiballZ_stat.py'
Jan 23 09:08:48 compute-0 sudo[195297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:48 compute-0 python3.9[195299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:48 compute-0 sudo[195297]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:49 compute-0 sudo[195420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxrwwzyrdpgjwrhkazezexpsneryvgtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159328.4902976-2210-9602745079019/AnsiballZ_copy.py'
Jan 23 09:08:49 compute-0 sudo[195420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:49 compute-0 python3.9[195422]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159328.4902976-2210-9602745079019/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:49 compute-0 sudo[195420]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:50 compute-0 sudo[195572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqjfbgsoaebywforgfcbomnpbajlfhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159329.9237506-2273-22651056494167/AnsiballZ_file.py'
Jan 23 09:08:50 compute-0 sudo[195572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:50 compute-0 python3.9[195574]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:50 compute-0 sudo[195572]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:50 compute-0 sudo[195724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wayvrssmiyuiihrrhtfqolzwvvramhgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159330.476297-2297-43627185007071/AnsiballZ_file.py'
Jan 23 09:08:50 compute-0 sudo[195724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:50 compute-0 python3.9[195726]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:08:50 compute-0 sudo[195724]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:51 compute-0 sudo[195876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvqlnchzppmumoferqhhwxhkvqnnnjdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159331.0492687-2321-186083678130973/AnsiballZ_stat.py'
Jan 23 09:08:51 compute-0 sudo[195876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:51 compute-0 python3.9[195878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:08:51 compute-0 sudo[195876]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:51 compute-0 sudo[195954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmyhqndpcgfabkbvxnjhfxjitkkhzpcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159331.0492687-2321-186083678130973/AnsiballZ_file.py'
Jan 23 09:08:51 compute-0 sudo[195954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:51 compute-0 python3.9[195956]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.2_oohn3_ recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:51 compute-0 sudo[195954]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:52 compute-0 python3.9[196106]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:54 compute-0 sudo[196527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvdeiqjpirmzexdpdzsjgbfotzpbtcyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159333.8288214-2432-4168174132494/AnsiballZ_container_config_data.py'
Jan 23 09:08:54 compute-0 sudo[196527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:54 compute-0 python3.9[196529]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 23 09:08:54 compute-0 sudo[196527]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:54 compute-0 podman[196530]: 2026-01-23 09:08:54.218678309 +0000 UTC m=+0.055784441 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:08:54 compute-0 sudo[196702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdeuufbasqokmwufztciuwrvfevwmygc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159334.6074367-2465-192865880555435/AnsiballZ_container_config_hash.py'
Jan 23 09:08:54 compute-0 sudo[196702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:54 compute-0 python3.9[196704]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:08:54 compute-0 sudo[196702]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:55 compute-0 sudo[196854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syurzowzfylnfnhdvjrxtowhcphnggpr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159335.3879747-2495-29388141182627/AnsiballZ_edpm_container_manage.py'
Jan 23 09:08:55 compute-0 sudo[196854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:55 compute-0 python3[196856]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:08:57 compute-0 podman[196867]: 2026-01-23 09:08:57.161804608 +0000 UTC m=+1.325765988 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 23 09:08:57 compute-0 podman[196946]: 2026-01-23 09:08:57.252080488 +0000 UTC m=+0.027080661 container create e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:08:57 compute-0 podman[196946]: 2026-01-23 09:08:57.238957351 +0000 UTC m=+0.013957534 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 23 09:08:57 compute-0 python3[196856]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 23 09:08:57 compute-0 sudo[196854]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:57 compute-0 sudo[197123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtgnurkmdstgqvocfgjnajcoveuinxob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159337.5762343-2519-87424108666999/AnsiballZ_stat.py'
Jan 23 09:08:57 compute-0 sudo[197123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:58 compute-0 python3.9[197125]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:58 compute-0 sudo[197123]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:58 compute-0 sudo[197277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iikssheobdfpqgsuoformncvfdhocrzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159338.3874817-2546-113433324497683/AnsiballZ_file.py'
Jan 23 09:08:58 compute-0 sudo[197277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:58 compute-0 python3.9[197279]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:58 compute-0 sudo[197277]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:58 compute-0 sudo[197353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtboleaidphaqwgkgpqgpwmrnuomdeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159338.3874817-2546-113433324497683/AnsiballZ_stat.py'
Jan 23 09:08:58 compute-0 sudo[197353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:59 compute-0 python3.9[197355]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:08:59 compute-0 sudo[197353]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:59 compute-0 sudo[197504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqucmyfplpnbboshdpfxegeclqquxwzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159339.086449-2546-158175029106987/AnsiballZ_copy.py'
Jan 23 09:08:59 compute-0 sudo[197504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:59 compute-0 python3.9[197506]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159339.086449-2546-158175029106987/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:08:59 compute-0 sudo[197504]: pam_unix(sudo:session): session closed for user root
Jan 23 09:08:59 compute-0 sudo[197580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikhurycntgvulxhfzdritsnemuxbuqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159339.086449-2546-158175029106987/AnsiballZ_systemd.py'
Jan 23 09:08:59 compute-0 sudo[197580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:08:59 compute-0 python3.9[197582]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:08:59 compute-0 systemd[1]: Reloading.
Jan 23 09:09:00 compute-0 systemd-sysv-generator[197605]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:09:00 compute-0 systemd-rc-local-generator[197602]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:09:00 compute-0 sudo[197580]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:00 compute-0 sudo[197691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggenlilykrauxpvccuqifochtcasuqgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159339.086449-2546-158175029106987/AnsiballZ_systemd.py'
Jan 23 09:09:00 compute-0 sudo[197691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:00 compute-0 python3.9[197693]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:09:00 compute-0 systemd[1]: Reloading.
Jan 23 09:09:00 compute-0 systemd-rc-local-generator[197715]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:09:00 compute-0 systemd-sysv-generator[197718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:09:00 compute-0 systemd[1]: Starting podman_exporter container...
Jan 23 09:09:01 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:09:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d9fa2f0bfe73a20e49ef72d85f547edf51b7c79e2f0889398192f7507fa6a5b/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 23 09:09:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d9fa2f0bfe73a20e49ef72d85f547edf51b7c79e2f0889398192f7507fa6a5b/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 23 09:09:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3.
Jan 23 09:09:01 compute-0 podman[197733]: 2026-01-23 09:09:01.044294429 +0000 UTC m=+0.073571968 container init e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.054Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.055Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.055Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.055Z caller=handler.go:105 level=info collector=container
Jan 23 09:09:01 compute-0 podman[197733]: 2026-01-23 09:09:01.056536164 +0000 UTC m=+0.085813683 container start e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:09:01 compute-0 podman[197733]: podman_exporter
Jan 23 09:09:01 compute-0 systemd[1]: Starting Podman API Service...
Jan 23 09:09:01 compute-0 systemd[1]: Started Podman API Service.
Jan 23 09:09:01 compute-0 systemd[1]: Started podman_exporter container.
Jan 23 09:09:01 compute-0 sudo[197691]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="Setting parallel job count to 13"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="Using sqlite as database backend"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 23 09:09:01 compute-0 podman[197756]: @ - - [23/Jan/2026:09:09:01 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 23 09:09:01 compute-0 podman[197756]: time="2026-01-23T09:09:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 23 09:09:01 compute-0 podman[197756]: @ - - [23/Jan/2026:09:09:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18075 "" "Go-http-client/1.1"
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.121Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.122Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 23 09:09:01 compute-0 podman_exporter[197745]: ts=2026-01-23T09:09:01.122Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 23 09:09:01 compute-0 podman[197754]: 2026-01-23 09:09:01.139216594 +0000 UTC m=+0.073405093 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:09:01 compute-0 systemd[1]: e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3-37f6e869d38f1178.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 09:09:01 compute-0 systemd[1]: e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3-37f6e869d38f1178.service: Failed with result 'exit-code'.
Jan 23 09:09:01 compute-0 python3.9[197935]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:09:02 compute-0 podman[197960]: 2026-01-23 09:09:02.199246779 +0000 UTC m=+0.037289448 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:09:02 compute-0 systemd[1]: 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc-2e6b171b25f52548.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 09:09:02 compute-0 systemd[1]: 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc-2e6b171b25f52548.service: Failed with result 'exit-code'.
Jan 23 09:09:02 compute-0 sudo[198101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sifhfesackbpvwlityqsflyffksrbwtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159342.5475998-2681-67787993691482/AnsiballZ_stat.py'
Jan 23 09:09:02 compute-0 sudo[198101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:03 compute-0 python3.9[198103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:09:03 compute-0 sudo[198101]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:03 compute-0 sudo[198226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iferbeeenjtnkhcqtupjicrecyxkhzqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159342.5475998-2681-67787993691482/AnsiballZ_copy.py'
Jan 23 09:09:03 compute-0 sudo[198226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:03 compute-0 python3.9[198228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159342.5475998-2681-67787993691482/.source.yaml _original_basename=.lidspqvi follow=False checksum=50c5852710e3871f6b2875bada7aace13a19c0eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:03 compute-0 sudo[198226]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:03 compute-0 sudo[198378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrdfhjqxszhcfnrogrusohkzlbmzcbzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159343.7541184-2726-101939990001064/AnsiballZ_stat.py'
Jan 23 09:09:03 compute-0 sudo[198378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:04 compute-0 python3.9[198380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:09:04 compute-0 sudo[198378]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:04 compute-0 sudo[198501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwhmmcvogzhdpsninzmftewtyfvbbbis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159343.7541184-2726-101939990001064/AnsiballZ_copy.py'
Jan 23 09:09:04 compute-0 sudo[198501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:04 compute-0 python3.9[198503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159343.7541184-2726-101939990001064/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:09:04 compute-0 sudo[198501]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:05 compute-0 sudo[198653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxugshhiicfepjgtrpjjmszdolgpreb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159345.2244773-2789-43116512897684/AnsiballZ_file.py'
Jan 23 09:09:05 compute-0 sudo[198653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:05 compute-0 python3.9[198655]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:05 compute-0 sudo[198653]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:05 compute-0 sudo[198805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xllkhrzsglthcsqbxoyjrzhgkntnjcsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159345.7878873-2813-249635060536796/AnsiballZ_file.py'
Jan 23 09:09:05 compute-0 sudo[198805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:06 compute-0 python3.9[198807]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:09:06 compute-0 sudo[198805]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:06 compute-0 sudo[198957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djjfbpnkrfxqbozjkkpfqtihjqmkqnlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159346.3122344-2837-191524262316661/AnsiballZ_stat.py'
Jan 23 09:09:06 compute-0 sudo[198957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:06 compute-0 python3.9[198959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:09:06 compute-0 sudo[198957]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:06 compute-0 sudo[199035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukemxwrxpmxtazvgzewnaakbjscjdtmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159346.3122344-2837-191524262316661/AnsiballZ_file.py'
Jan 23 09:09:06 compute-0 sudo[199035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:07 compute-0 python3.9[199037]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.0b6qi_r3 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:07 compute-0 sudo[199035]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:07 compute-0 python3.9[199187]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:09 compute-0 sudo[199608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgfnwwlgxjlqtwbmnethnsgjcvoyppha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159348.9842346-2948-73869458933036/AnsiballZ_container_config_data.py'
Jan 23 09:09:09 compute-0 sudo[199608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:09 compute-0 python3.9[199610]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 23 09:09:09 compute-0 sudo[199608]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:09 compute-0 sudo[199760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwqkroabfawlppcxbwpoxhoxwsjhqlat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159349.7531652-2981-228629743735610/AnsiballZ_container_config_hash.py'
Jan 23 09:09:09 compute-0 sudo[199760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:10 compute-0 python3.9[199762]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 09:09:10 compute-0 sudo[199760]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:10 compute-0 sudo[199912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atrtlzbmdvbnqshajtwlpkrmedyptujt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159350.5403059-3011-149774799299369/AnsiballZ_edpm_container_manage.py'
Jan 23 09:09:10 compute-0 sudo[199912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:10 compute-0 python3[199914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 09:09:13 compute-0 podman[199965]: 2026-01-23 09:09:13.309455645 +0000 UTC m=+0.143089876 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:09:13 compute-0 podman[199926]: 2026-01-23 09:09:13.537046986 +0000 UTC m=+2.547166153 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 23 09:09:13 compute-0 podman[200024]: 2026-01-23 09:09:13.631006114 +0000 UTC m=+0.028534663 container create 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64)
Jan 23 09:09:13 compute-0 podman[200024]: 2026-01-23 09:09:13.617075754 +0000 UTC m=+0.014604313 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 23 09:09:13 compute-0 python3[199914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 23 09:09:13 compute-0 sudo[199912]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:14 compute-0 sudo[200202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppporcwclsfqelcyyvdtmeatqfjvqpbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159353.92568-3035-27518274792608/AnsiballZ_stat.py'
Jan 23 09:09:14 compute-0 sudo[200202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:14 compute-0 python3.9[200204]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:09:14 compute-0 sudo[200202]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:14 compute-0 sudo[200356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgeovnymbvifmlammlvocdtiakxrsjqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159354.6384535-3062-207237411651670/AnsiballZ_file.py'
Jan 23 09:09:14 compute-0 sudo[200356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:15 compute-0 python3.9[200358]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:15 compute-0 sudo[200356]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:15 compute-0 sudo[200432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrerhkphwezcalyvhvqcsjmcnoqufrks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159354.6384535-3062-207237411651670/AnsiballZ_stat.py'
Jan 23 09:09:15 compute-0 sudo[200432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:15 compute-0 python3.9[200434]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:09:15 compute-0 sudo[200432]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:15 compute-0 sudo[200583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatluymewskvsynanhbskqmxsedgieyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159355.4467454-3062-246049889006020/AnsiballZ_copy.py'
Jan 23 09:09:15 compute-0 sudo[200583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:15 compute-0 python3.9[200585]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159355.4467454-3062-246049889006020/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:15 compute-0 sudo[200583]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:16 compute-0 sudo[200669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjvggromfetcjfemzmjnpsklkqvqbtrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159355.4467454-3062-246049889006020/AnsiballZ_systemd.py'
Jan 23 09:09:16 compute-0 sudo[200669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:16 compute-0 podman[200633]: 2026-01-23 09:09:16.156253033 +0000 UTC m=+0.040264081 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:09:16 compute-0 python3.9[200682]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:09:16 compute-0 systemd[1]: Reloading.
Jan 23 09:09:16 compute-0 systemd-sysv-generator[200709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:09:16 compute-0 systemd-rc-local-generator[200705]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:09:16 compute-0 sudo[200669]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:16 compute-0 sudo[200791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knhudgkwxuqswocwxumhwrymsuoytiew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159355.4467454-3062-246049889006020/AnsiballZ_systemd.py'
Jan 23 09:09:16 compute-0 sudo[200791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:17 compute-0 python3.9[200793]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:09:17 compute-0 systemd[1]: Reloading.
Jan 23 09:09:17 compute-0 systemd-rc-local-generator[200815]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:09:17 compute-0 systemd-sysv-generator[200819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:09:17 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 23 09:09:17 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:09:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ac7797a815d59c4ca163c2b05d53cf64a6537e7e304ab92d45970ca4f2b401/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 09:09:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ac7797a815d59c4ca163c2b05d53cf64a6537e7e304ab92d45970ca4f2b401/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 23 09:09:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ac7797a815d59c4ca163c2b05d53cf64a6537e7e304ab92d45970ca4f2b401/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 23 09:09:17 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df.
Jan 23 09:09:17 compute-0 podman[200833]: 2026-01-23 09:09:17.45094614 +0000 UTC m=+0.087204497 container init 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *bridge.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *coverage.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *datapath.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *iface.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *memory.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *ovn.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *pmd_perf.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *pmd_rxq.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: INFO    09:09:17 main.go:48: registering *vswitch.Collector
Jan 23 09:09:17 compute-0 openstack_network_exporter[200845]: NOTICE  09:09:17 main.go:76: listening on https://:9105/metrics
Jan 23 09:09:17 compute-0 podman[200833]: 2026-01-23 09:09:17.468601839 +0000 UTC m=+0.104860184 container start 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 23 09:09:17 compute-0 podman[200833]: openstack_network_exporter
Jan 23 09:09:17 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 23 09:09:17 compute-0 sudo[200791]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:17 compute-0 podman[200856]: 2026-01-23 09:09:17.524255672 +0000 UTC m=+0.048536267 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 23 09:09:18 compute-0 python3.9[201025]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 09:09:19 compute-0 sudo[201175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-salpflgfwijpkbfdantakmxuksitztgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159358.876822-3197-101822928083788/AnsiballZ_stat.py'
Jan 23 09:09:19 compute-0 sudo[201175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:19 compute-0 python3.9[201177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:09:19 compute-0 sudo[201175]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:19 compute-0 sudo[201300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nofksekhmcpkwsafsxnqpcmetqscjomq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159358.876822-3197-101822928083788/AnsiballZ_copy.py'
Jan 23 09:09:19 compute-0 sudo[201300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:19 compute-0 python3.9[201302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159358.876822-3197-101822928083788/.source.yaml _original_basename=.s3j90_h6 follow=False checksum=d660a48fb6fc313229f1e5b0a2a20f28b49fddff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:19 compute-0 sudo[201300]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:20 compute-0 sudo[201452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ususkgwajdvqqahsmtipgowtatsasptj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159359.8370476-3242-9359319749300/AnsiballZ_find.py'
Jan 23 09:09:20 compute-0 sudo[201452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:20 compute-0 python3.9[201454]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:09:20 compute-0 sudo[201452]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:20 compute-0 sudo[201604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatzbbjjmwubwnwbchwimoknrlmlfpcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159360.619315-3270-80786680796555/AnsiballZ_podman_container_info.py'
Jan 23 09:09:20 compute-0 sudo[201604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:21 compute-0 python3.9[201606]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 23 09:09:21 compute-0 sudo[201604]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:21 compute-0 sudo[201766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izgbjvjbpyaishfblhavwdreqavbexki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159361.2695198-3278-172453857188320/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:21 compute-0 sudo[201766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:21 compute-0 python3.9[201768]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:21 compute-0 systemd[1]: Started libpod-conmon-f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d.scope.
Jan 23 09:09:21 compute-0 podman[201769]: 2026-01-23 09:09:21.794759989 +0000 UTC m=+0.045764587 container exec f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:09:21 compute-0 podman[201785]: 2026-01-23 09:09:21.904766331 +0000 UTC m=+0.100463208 container exec_died f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 09:09:21 compute-0 podman[201769]: 2026-01-23 09:09:21.907012919 +0000 UTC m=+0.158017527 container exec_died f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Jan 23 09:09:21 compute-0 systemd[1]: libpod-conmon-f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d.scope: Deactivated successfully.
Jan 23 09:09:21 compute-0 sudo[201766]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:22 compute-0 sudo[201943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyvfxrlssdcsyewknljvpmlaunyzvtur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159362.0690749-3286-17096031751525/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:22 compute-0 sudo[201943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:22 compute-0 python3.9[201945]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:22 compute-0 systemd[1]: Started libpod-conmon-f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d.scope.
Jan 23 09:09:22 compute-0 podman[201946]: 2026-01-23 09:09:22.485891841 +0000 UTC m=+0.042933608 container exec f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:09:22 compute-0 podman[201946]: 2026-01-23 09:09:22.488040384 +0000 UTC m=+0.045082141 container exec_died f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:09:22 compute-0 systemd[1]: libpod-conmon-f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d.scope: Deactivated successfully.
Jan 23 09:09:22 compute-0 sudo[201943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:22 compute-0 auditd[673]: Audit daemon rotating log files
Jan 23 09:09:22 compute-0 sudo[202120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rimrayflrocpdfqusrvzxrsxaukbmpot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159362.6576412-3294-137864648288354/AnsiballZ_file.py'
Jan 23 09:09:22 compute-0 sudo[202120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:22 compute-0 python3.9[202122]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:23 compute-0 sudo[202120]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:23 compute-0 sudo[202272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vngweqcjxpzegywjpsrltbspfoefxgav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159363.1951044-3303-220768122948687/AnsiballZ_podman_container_info.py'
Jan 23 09:09:23 compute-0 sudo[202272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:23 compute-0 python3.9[202274]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 23 09:09:23 compute-0 sudo[202272]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:23 compute-0 sudo[202434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eslsdshugwqwpbmldravzqajhhyivmlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159363.7162573-3311-100538296405872/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:23 compute-0 sudo[202434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:24 compute-0 python3.9[202436]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:24 compute-0 systemd[1]: Started libpod-conmon-a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d.scope.
Jan 23 09:09:24 compute-0 podman[202437]: 2026-01-23 09:09:24.116280902 +0000 UTC m=+0.044017892 container exec a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:09:24 compute-0 podman[202452]: 2026-01-23 09:09:24.170784917 +0000 UTC m=+0.043183018 container exec_died a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 09:09:24 compute-0 podman[202437]: 2026-01-23 09:09:24.173205734 +0000 UTC m=+0.100942724 container exec_died a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:09:24 compute-0 systemd[1]: libpod-conmon-a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d.scope: Deactivated successfully.
Jan 23 09:09:24 compute-0 sudo[202434]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:24 compute-0 sudo[202621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjvzsvknmudxxuqzhchpqddnynvmmedg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159364.3246236-3319-1072619491027/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:24 compute-0 sudo[202621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:24 compute-0 podman[202585]: 2026-01-23 09:09:24.540713775 +0000 UTC m=+0.067420914 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 23 09:09:24 compute-0 python3.9[202629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:24 compute-0 systemd[1]: Started libpod-conmon-a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d.scope.
Jan 23 09:09:24 compute-0 podman[202637]: 2026-01-23 09:09:24.737810386 +0000 UTC m=+0.048044068 container exec a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:09:24 compute-0 podman[202654]: 2026-01-23 09:09:24.791767508 +0000 UTC m=+0.045156460 container exec_died a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:09:24 compute-0 podman[202637]: 2026-01-23 09:09:24.795128599 +0000 UTC m=+0.105362281 container exec_died a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:09:24 compute-0 systemd[1]: libpod-conmon-a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d.scope: Deactivated successfully.
Jan 23 09:09:24 compute-0 sudo[202621]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:25 compute-0 sudo[202813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqbtbsswminemtdakazgxkdqafwiexiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159364.9471865-3327-139929434635092/AnsiballZ_file.py'
Jan 23 09:09:25 compute-0 sudo[202813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:25 compute-0 python3.9[202815]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:25 compute-0 sudo[202813]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:25 compute-0 sudo[202965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmfyozcqavqjnfkzgwvthbdctbhydhva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159365.469336-3336-185858919980257/AnsiballZ_podman_container_info.py'
Jan 23 09:09:25 compute-0 sudo[202965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:25 compute-0 python3.9[202967]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 23 09:09:25 compute-0 sudo[202965]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:26 compute-0 sudo[203127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwjbqigjcctrdaknjflcdziiotigbupq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159365.9659445-3344-91517785382255/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:26 compute-0 sudo[203127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:26 compute-0 python3.9[203129]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:26 compute-0 systemd[1]: Started libpod-conmon-29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc.scope.
Jan 23 09:09:26 compute-0 podman[203130]: 2026-01-23 09:09:26.355740233 +0000 UTC m=+0.046822465 container exec 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:09:26 compute-0 podman[203130]: 2026-01-23 09:09:26.361842365 +0000 UTC m=+0.052924596 container exec_died 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:09:26 compute-0 sudo[203127]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:26 compute-0 systemd[1]: libpod-conmon-29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc.scope: Deactivated successfully.
Jan 23 09:09:26 compute-0 sudo[203304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvkpledabawuzfhlagwaxnlbikutcxsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159366.5187724-3352-112029087977976/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:26 compute-0 sudo[203304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:26 compute-0 python3.9[203306]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:26 compute-0 systemd[1]: Started libpod-conmon-29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc.scope.
Jan 23 09:09:26 compute-0 podman[203307]: 2026-01-23 09:09:26.932768453 +0000 UTC m=+0.046525563 container exec 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:09:26 compute-0 podman[203323]: 2026-01-23 09:09:26.986778894 +0000 UTC m=+0.044000959 container exec_died 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:09:26 compute-0 podman[203307]: 2026-01-23 09:09:26.989212716 +0000 UTC m=+0.102969847 container exec_died 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 09:09:26 compute-0 systemd[1]: libpod-conmon-29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc.scope: Deactivated successfully.
Jan 23 09:09:27 compute-0 sudo[203304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:27 compute-0 sudo[203482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgvjtarxxhivkmhyotgscioeboebcymb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159367.1386533-3360-127425853630862/AnsiballZ_file.py'
Jan 23 09:09:27 compute-0 sudo[203482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:27 compute-0 python3.9[203484]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:27 compute-0 sudo[203482]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:27 compute-0 sudo[203634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzugfrmkwtjwjboszqahuqrxsnondykg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159367.639079-3369-199822424453861/AnsiballZ_podman_container_info.py'
Jan 23 09:09:27 compute-0 sudo[203634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:27 compute-0 python3.9[203636]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 23 09:09:28 compute-0 sudo[203634]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:28 compute-0 sudo[203797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chovjtksuaexxfbyybfwjwccksjlgsov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159368.1301076-3377-265532874811883/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:28 compute-0 sudo[203797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:28 compute-0 python3.9[203799]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:28 compute-0 systemd[1]: Started libpod-conmon-d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd.scope.
Jan 23 09:09:28 compute-0 podman[203800]: 2026-01-23 09:09:28.540441389 +0000 UTC m=+0.045623561 container exec d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:09:28 compute-0 podman[203816]: 2026-01-23 09:09:28.592722419 +0000 UTC m=+0.043342439 container exec_died d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:09:28 compute-0 podman[203800]: 2026-01-23 09:09:28.596127063 +0000 UTC m=+0.101309236 container exec_died d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:09:28 compute-0 systemd[1]: libpod-conmon-d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd.scope: Deactivated successfully.
Jan 23 09:09:28 compute-0 sudo[203797]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:28 compute-0 sudo[203975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saopyibhchppzmnstmgwrkneorhistgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159368.7424548-3385-267977336042531/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:28 compute-0 sudo[203975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:29 compute-0 python3.9[203977]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:29 compute-0 systemd[1]: Started libpod-conmon-d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd.scope.
Jan 23 09:09:29 compute-0 podman[203978]: 2026-01-23 09:09:29.140192457 +0000 UTC m=+0.044872154 container exec d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:09:29 compute-0 podman[203994]: 2026-01-23 09:09:29.194748529 +0000 UTC m=+0.045889464 container exec_died d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:09:29 compute-0 podman[203978]: 2026-01-23 09:09:29.196813345 +0000 UTC m=+0.101493022 container exec_died d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:09:29 compute-0 systemd[1]: libpod-conmon-d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd.scope: Deactivated successfully.
Jan 23 09:09:29 compute-0 sudo[203975]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:29 compute-0 sudo[204152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvaflmkddvvdbqxjjsyjohfqqejtsmlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159369.3719237-3393-108574551936931/AnsiballZ_file.py'
Jan 23 09:09:29 compute-0 sudo[204152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:29 compute-0 python3.9[204154]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:29 compute-0 sudo[204152]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:30 compute-0 sudo[204304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmyjyjwsjrvynhdpbptdqukoajbodlrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159369.9041042-3402-75366048518698/AnsiballZ_podman_container_info.py'
Jan 23 09:09:30 compute-0 sudo[204304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:30 compute-0 python3.9[204306]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 23 09:09:30 compute-0 sudo[204304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:30 compute-0 sudo[204466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvjqqydbjhaujfcafywsscfcnlorxev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159370.4085753-3410-157811624280618/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:30 compute-0 sudo[204466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:30 compute-0 python3.9[204468]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:30 compute-0 systemd[1]: Started libpod-conmon-e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3.scope.
Jan 23 09:09:30 compute-0 podman[204469]: 2026-01-23 09:09:30.810600867 +0000 UTC m=+0.042497344 container exec e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:09:30 compute-0 podman[204484]: 2026-01-23 09:09:30.864760411 +0000 UTC m=+0.045479329 container exec_died e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:09:30 compute-0 podman[204469]: 2026-01-23 09:09:30.868947361 +0000 UTC m=+0.100843838 container exec_died e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:09:30 compute-0 systemd[1]: libpod-conmon-e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3.scope: Deactivated successfully.
Jan 23 09:09:30 compute-0 sudo[204466]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:31 compute-0 sudo[204663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gejrtbrbkhactnfmkofmvdfkzzvbtyxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159371.0441542-3418-264267463320157/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:31 compute-0 sudo[204663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:31 compute-0 podman[204592]: 2026-01-23 09:09:31.232433703 +0000 UTC m=+0.065907128 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:09:31 compute-0 python3.9[204665]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:31 compute-0 systemd[1]: Started libpod-conmon-e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3.scope.
Jan 23 09:09:31 compute-0 podman[204666]: 2026-01-23 09:09:31.468266243 +0000 UTC m=+0.058687187 container exec e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:09:31 compute-0 podman[204682]: 2026-01-23 09:09:31.520737601 +0000 UTC m=+0.044260881 container exec_died e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:09:31 compute-0 podman[204666]: 2026-01-23 09:09:31.523529879 +0000 UTC m=+0.113950813 container exec_died e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:09:31 compute-0 systemd[1]: libpod-conmon-e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3.scope: Deactivated successfully.
Jan 23 09:09:31 compute-0 sudo[204663]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:31 compute-0 sudo[204840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvlokmzyassnesqytykqvppwttvfvrkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159371.6705291-3426-173326504995183/AnsiballZ_file.py'
Jan 23 09:09:31 compute-0 sudo[204840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:32 compute-0 python3.9[204842]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:32 compute-0 sudo[204840]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:32 compute-0 sudo[205003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjizanhaqiknllnqgsgerwspjsvvfnur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159372.2407832-3435-249293162913944/AnsiballZ_podman_container_info.py'
Jan 23 09:09:32 compute-0 sudo[205003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:32 compute-0 podman[204966]: 2026-01-23 09:09:32.444268875 +0000 UTC m=+0.045068506 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:09:32 compute-0 python3.9[205010]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 23 09:09:32 compute-0 sudo[205003]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:32 compute-0 sudo[205171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yagnaymfwpfmhfimambnmaohqubdrnmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159372.7773435-3443-258523692838549/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:32 compute-0 sudo[205171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:33 compute-0 python3.9[205173]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:33 compute-0 systemd[1]: Started libpod-conmon-7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df.scope.
Jan 23 09:09:33 compute-0 podman[205174]: 2026-01-23 09:09:33.173250091 +0000 UTC m=+0.046285010 container exec 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:09:33 compute-0 podman[205190]: 2026-01-23 09:09:33.22776239 +0000 UTC m=+0.044126949 container exec_died 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 23 09:09:33 compute-0 podman[205174]: 2026-01-23 09:09:33.231164018 +0000 UTC m=+0.104198947 container exec_died 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Jan 23 09:09:33 compute-0 systemd[1]: libpod-conmon-7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df.scope: Deactivated successfully.
Jan 23 09:09:33 compute-0 sudo[205171]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:33 compute-0 sudo[205349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khkkfxjfawvaxemajkzwwmskimuaxrta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159373.3854978-3451-88096193803299/AnsiballZ_podman_container_exec.py'
Jan 23 09:09:33 compute-0 sudo[205349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:33 compute-0 python3.9[205351]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 23 09:09:33 compute-0 systemd[1]: Started libpod-conmon-7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df.scope.
Jan 23 09:09:33 compute-0 podman[205352]: 2026-01-23 09:09:33.77679779 +0000 UTC m=+0.042088093 container exec 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6)
Jan 23 09:09:33 compute-0 podman[205369]: 2026-01-23 09:09:33.82776391 +0000 UTC m=+0.042346910 container exec_died 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vendor=Red Hat, Inc.)
Jan 23 09:09:33 compute-0 podman[205352]: 2026-01-23 09:09:33.830041046 +0000 UTC m=+0.095331349 container exec_died 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 23 09:09:33 compute-0 systemd[1]: libpod-conmon-7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df.scope: Deactivated successfully.
Jan 23 09:09:33 compute-0 sudo[205349]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:34 compute-0 sudo[205528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japzrpuqmsokruqosntdqgciqsrxubpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159373.9913023-3459-98883485177022/AnsiballZ_file.py'
Jan 23 09:09:34 compute-0 sudo[205528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:09:34 compute-0 python3.9[205530]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:09:34 compute-0 sudo[205528]: pam_unix(sudo:session): session closed for user root
Jan 23 09:09:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:09:39.847 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:09:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:09:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:09:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:09:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:09:44 compute-0 podman[205555]: 2026-01-23 09:09:44.228166276 +0000 UTC m=+0.064501657 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.086 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.101 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:46 compute-0 nova_compute[182092]: 2026-01-23 09:09:46.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:09:47 compute-0 podman[205571]: 2026-01-23 09:09:47.20013643 +0000 UTC m=+0.039116196 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.663 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.680 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.681 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.681 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.681 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.881 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.882 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5911MB free_disk=73.41640853881836GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.882 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.882 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.940 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.941 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.964 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.974 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.975 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:09:47 compute-0 nova_compute[182092]: 2026-01-23 09:09:47.975 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:09:48 compute-0 podman[205592]: 2026-01-23 09:09:48.201148784 +0000 UTC m=+0.039526220 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Jan 23 09:09:55 compute-0 podman[205610]: 2026-01-23 09:09:55.243487122 +0000 UTC m=+0.081392663 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:10:02 compute-0 podman[205708]: 2026-01-23 09:10:02.197236952 +0000 UTC m=+0.036970164 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:10:02 compute-0 sudo[205780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmgbazwmrgqfgmrzkjuttnmhbgiirvtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159402.0639226-3866-188376793678582/AnsiballZ_file.py'
Jan 23 09:10:02 compute-0 sudo[205780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:02 compute-0 python3.9[205782]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:02 compute-0 sudo[205780]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:02 compute-0 sudo[205940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhbeocgpjpbqjfaqquuletceofqzzzqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159402.5950544-3890-5421272779366/AnsiballZ_stat.py'
Jan 23 09:10:02 compute-0 sudo[205940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:02 compute-0 podman[205906]: 2026-01-23 09:10:02.811241024 +0000 UTC m=+0.046007393 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 23 09:10:02 compute-0 python3.9[205950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:02 compute-0 sudo[205940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:03 compute-0 sudo[206072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igfxepvrnpjumubyzuignyxbzcqzyjtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159402.5950544-3890-5421272779366/AnsiballZ_copy.py'
Jan 23 09:10:03 compute-0 sudo[206072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:03 compute-0 python3.9[206074]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159402.5950544-3890-5421272779366/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:03 compute-0 sudo[206072]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:03 compute-0 sudo[206224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqacojlkplqovhpahlvflinoolxgceer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159403.674191-3938-172240906742044/AnsiballZ_file.py'
Jan 23 09:10:03 compute-0 sudo[206224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:04 compute-0 python3.9[206226]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:04 compute-0 sudo[206224]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:04 compute-0 sudo[206376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcwydsffsqzdktqfclswfljgwqzvldgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159404.2137225-3962-250833255200845/AnsiballZ_stat.py'
Jan 23 09:10:04 compute-0 sudo[206376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:04 compute-0 python3.9[206378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:04 compute-0 sudo[206376]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:04 compute-0 sudo[206454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxmimrzaifsxzfvesgxufxcjnyckzlvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159404.2137225-3962-250833255200845/AnsiballZ_file.py'
Jan 23 09:10:04 compute-0 sudo[206454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:04 compute-0 python3.9[206456]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:04 compute-0 sudo[206454]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:05 compute-0 sudo[206606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyywjqeuxtoxhyerypdvzrtjedsayexo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159405.0252008-3998-235384290850026/AnsiballZ_stat.py'
Jan 23 09:10:05 compute-0 sudo[206606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:05 compute-0 python3.9[206608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:05 compute-0 sudo[206606]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:05 compute-0 sudo[206684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhjexifgsxlqohtwlywxmvyhkkzlhifk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159405.0252008-3998-235384290850026/AnsiballZ_file.py'
Jan 23 09:10:05 compute-0 sudo[206684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:05 compute-0 python3.9[206686]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.17a56fps recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:05 compute-0 sudo[206684]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:06 compute-0 sudo[206836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tstcqpzcfgerscisajnmiojjqztrymzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159405.8725512-4034-264496817938411/AnsiballZ_stat.py'
Jan 23 09:10:06 compute-0 sudo[206836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:06 compute-0 python3.9[206838]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:06 compute-0 sudo[206836]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:06 compute-0 sudo[206914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvfvjgelapbglrgistzuepgubgplxxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159405.8725512-4034-264496817938411/AnsiballZ_file.py'
Jan 23 09:10:06 compute-0 sudo[206914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:06 compute-0 python3.9[206916]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:06 compute-0 sudo[206914]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:07 compute-0 sudo[207066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lizrnjiyvljrsavpbdqtictoyaqrgtzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159406.8492525-4073-83924681134464/AnsiballZ_command.py'
Jan 23 09:10:07 compute-0 sudo[207066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:07 compute-0 python3.9[207068]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:10:07 compute-0 sudo[207066]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:07 compute-0 sudo[207219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvsekrnkzfokxluhfzaomlsqnuveqvju ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769159407.594816-4097-154918412131806/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:10:07 compute-0 sudo[207219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:08 compute-0 python3[207221]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:10:08 compute-0 sudo[207219]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:08 compute-0 sudo[207371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chqlezgkkocahsxslekdrxoalkcuwgtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159408.2540736-4121-6366409971701/AnsiballZ_stat.py'
Jan 23 09:10:08 compute-0 sudo[207371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:08 compute-0 python3.9[207373]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:08 compute-0 sudo[207371]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:08 compute-0 sudo[207449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldwwpioulknospikdkoffhtqtogbzyjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159408.2540736-4121-6366409971701/AnsiballZ_file.py'
Jan 23 09:10:08 compute-0 sudo[207449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:08 compute-0 python3.9[207451]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:08 compute-0 sudo[207449]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:09 compute-0 sudo[207601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmojvsyktjfpplryltcbklqmxbgpdobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159409.142334-4157-66791074025615/AnsiballZ_stat.py'
Jan 23 09:10:09 compute-0 sudo[207601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:09 compute-0 python3.9[207603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:09 compute-0 sudo[207601]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:09 compute-0 sudo[207679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vatkyegkycspwfozseivvxoewwndojtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159409.142334-4157-66791074025615/AnsiballZ_file.py'
Jan 23 09:10:09 compute-0 sudo[207679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:09 compute-0 python3.9[207681]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:09 compute-0 sudo[207679]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:10 compute-0 sudo[207831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkzynmknjyqopoelepmhibtxlyslfsvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159410.0936937-4193-212592736899182/AnsiballZ_stat.py'
Jan 23 09:10:10 compute-0 sudo[207831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:10 compute-0 python3.9[207833]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:10 compute-0 sudo[207831]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:10 compute-0 sudo[207909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxvgguomvdigvnowrsrkvuodzheeekwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159410.0936937-4193-212592736899182/AnsiballZ_file.py'
Jan 23 09:10:10 compute-0 sudo[207909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:10 compute-0 python3.9[207911]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:10 compute-0 sudo[207909]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:11 compute-0 sudo[208061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjtvpivhmorxmnhnczcjycaosluvzmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159410.9596648-4229-147544886080791/AnsiballZ_stat.py'
Jan 23 09:10:11 compute-0 sudo[208061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:11 compute-0 python3.9[208063]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:11 compute-0 sudo[208061]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:11 compute-0 sudo[208139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pclbddcqmjonjqlqgqgufedmggquzxwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159410.9596648-4229-147544886080791/AnsiballZ_file.py'
Jan 23 09:10:11 compute-0 sudo[208139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:11 compute-0 python3.9[208141]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:11 compute-0 sudo[208139]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:12 compute-0 sudo[208291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfrthfqhnpjqmclgjlguhkxppccgigpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159411.8781316-4265-216329956493328/AnsiballZ_stat.py'
Jan 23 09:10:12 compute-0 sudo[208291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:12 compute-0 python3.9[208293]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:10:12 compute-0 sudo[208291]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:12 compute-0 sudo[208416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivpawbxnikodlzsivmfukqgtdrqojygl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159411.8781316-4265-216329956493328/AnsiballZ_copy.py'
Jan 23 09:10:12 compute-0 sudo[208416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:12 compute-0 python3.9[208418]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159411.8781316-4265-216329956493328/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:12 compute-0 sudo[208416]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:13 compute-0 sudo[208568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnddbmiujxnlftofvqhzqooaouhvxzdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159412.934669-4310-246377261889510/AnsiballZ_file.py'
Jan 23 09:10:13 compute-0 sudo[208568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:13 compute-0 python3.9[208570]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:13 compute-0 sudo[208568]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:13 compute-0 sudo[208720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkaurzemlowuiroznugtjnnbsrzoqjue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159413.4200044-4334-238706894558944/AnsiballZ_command.py'
Jan 23 09:10:13 compute-0 sudo[208720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:13 compute-0 python3.9[208722]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:10:13 compute-0 sudo[208720]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:14 compute-0 sudo[208875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijcanqdcfkjokzqaehzisxuugxuoljvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159413.9404914-4358-478766579761/AnsiballZ_blockinfile.py'
Jan 23 09:10:14 compute-0 sudo[208875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:14 compute-0 podman[208877]: 2026-01-23 09:10:14.293195927 +0000 UTC m=+0.033584662 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 09:10:14 compute-0 python3.9[208878]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:14 compute-0 sudo[208875]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:14 compute-0 sudo[209043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twxznlscdsnmlqlqlkwsxdiswpqrgqbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159414.6684623-4385-273110756768150/AnsiballZ_command.py'
Jan 23 09:10:14 compute-0 sudo[209043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:14 compute-0 python3.9[209045]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:10:15 compute-0 sudo[209043]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:15 compute-0 sudo[209196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flisfgevfcyapakfrvcsmapphphonbgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159415.3895156-4409-59120800376255/AnsiballZ_stat.py'
Jan 23 09:10:15 compute-0 sudo[209196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:15 compute-0 python3.9[209198]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:10:15 compute-0 sudo[209196]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:16 compute-0 sudo[209350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yganfgtqborjfbsorgjdqvzodzijmrmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159415.9203725-4433-91699413000766/AnsiballZ_command.py'
Jan 23 09:10:16 compute-0 sudo[209350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:16 compute-0 python3.9[209352]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:10:16 compute-0 sudo[209350]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:16 compute-0 sudo[209505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxxmwfqdnkeaobybromzthtbdoiqsioa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769159416.4634116-4457-226359172862794/AnsiballZ_file.py'
Jan 23 09:10:16 compute-0 sudo[209505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:10:16 compute-0 python3.9[209507]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:10:16 compute-0 sudo[209505]: pam_unix(sudo:session): session closed for user root
Jan 23 09:10:17 compute-0 sshd-session[182378]: Connection closed by 192.168.122.30 port 33074
Jan 23 09:10:17 compute-0 sshd-session[182375]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:10:17 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 09:10:17 compute-0 systemd[1]: session-25.scope: Consumed 1min 16.415s CPU time.
Jan 23 09:10:17 compute-0 systemd-logind[746]: Session 25 logged out. Waiting for processes to exit.
Jan 23 09:10:17 compute-0 systemd-logind[746]: Removed session 25.
Jan 23 09:10:17 compute-0 podman[209532]: 2026-01-23 09:10:17.331292483 +0000 UTC m=+0.039198235 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:10:19 compute-0 podman[209553]: 2026-01-23 09:10:19.228490905 +0000 UTC m=+0.067296253 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=)
Jan 23 09:10:26 compute-0 podman[209572]: 2026-01-23 09:10:26.241347514 +0000 UTC m=+0.079468472 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:10:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:10:33 compute-0 podman[209596]: 2026-01-23 09:10:33.194742784 +0000 UTC m=+0.031211216 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:10:33 compute-0 podman[209595]: 2026-01-23 09:10:33.201256716 +0000 UTC m=+0.039283106 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 23 09:10:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:10:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:39.848 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:10:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:39.849 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:10:45 compute-0 podman[209633]: 2026-01-23 09:10:45.192258433 +0000 UTC m=+0.030945065 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:10:46 compute-0 nova_compute[182092]: 2026-01-23 09:10:46.962 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:47 compute-0 nova_compute[182092]: 2026-01-23 09:10:47.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:47 compute-0 nova_compute[182092]: 2026-01-23 09:10:47.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:48 compute-0 podman[209649]: 2026-01-23 09:10:48.195126378 +0000 UTC m=+0.034713690 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.671 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.839 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.840 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6019MB free_disk=73.41617965698242GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.840 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.840 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.891 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.891 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.919 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.932 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.933 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:10:48 compute-0 nova_compute[182092]: 2026-01-23 09:10:48.934 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:10:49 compute-0 nova_compute[182092]: 2026-01-23 09:10:49.934 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:49 compute-0 nova_compute[182092]: 2026-01-23 09:10:49.935 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:10:49 compute-0 nova_compute[182092]: 2026-01-23 09:10:49.935 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:10:49 compute-0 nova_compute[182092]: 2026-01-23 09:10:49.945 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:10:49 compute-0 nova_compute[182092]: 2026-01-23 09:10:49.946 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:10:50 compute-0 podman[209670]: 2026-01-23 09:10:50.205946083 +0000 UTC m=+0.043455211 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 23 09:10:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:53.285 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:10:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:53.286 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:10:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:10:53.287 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:10:57 compute-0 podman[209688]: 2026-01-23 09:10:57.217706574 +0000 UTC m=+0.052290700 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:11:04 compute-0 podman[209712]: 2026-01-23 09:11:04.203213267 +0000 UTC m=+0.038775219 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:11:04 compute-0 podman[209711]: 2026-01-23 09:11:04.203239908 +0000 UTC m=+0.040825364 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:11:16 compute-0 podman[209749]: 2026-01-23 09:11:16.222199396 +0000 UTC m=+0.060617105 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 09:11:19 compute-0 podman[209766]: 2026-01-23 09:11:19.222168838 +0000 UTC m=+0.061208050 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:11:21 compute-0 podman[209788]: 2026-01-23 09:11:21.199183655 +0000 UTC m=+0.038138177 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41)
Jan 23 09:11:28 compute-0 podman[209806]: 2026-01-23 09:11:28.213546867 +0000 UTC m=+0.052412375 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 09:11:35 compute-0 podman[209830]: 2026-01-23 09:11:35.198200347 +0000 UTC m=+0.033728533 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:11:35 compute-0 podman[209829]: 2026-01-23 09:11:35.203165039 +0000 UTC m=+0.041385190 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:11:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:11:39.849 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:11:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:11:39.850 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:11:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:11:39.850 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:11:47 compute-0 podman[209867]: 2026-01-23 09:11:47.203242803 +0000 UTC m=+0.037346013 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.670 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.851 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.852 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6088MB free_disk=73.41617965698242GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.852 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.852 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.890 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.891 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.906 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.918 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.919 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:11:48 compute-0 nova_compute[182092]: 2026-01-23 09:11:48.919 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.914 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.932 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.932 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.932 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.942 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.942 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.943 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:49 compute-0 nova_compute[182092]: 2026-01-23 09:11:49.943 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:11:50 compute-0 podman[209883]: 2026-01-23 09:11:50.193836746 +0000 UTC m=+0.031663308 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:11:50 compute-0 nova_compute[182092]: 2026-01-23 09:11:50.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:50 compute-0 nova_compute[182092]: 2026-01-23 09:11:50.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:11:52 compute-0 podman[209904]: 2026-01-23 09:11:52.19591712 +0000 UTC m=+0.034199491 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 23 09:11:59 compute-0 podman[209922]: 2026-01-23 09:11:59.221154226 +0000 UTC m=+0.053722805 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:12:06 compute-0 podman[209947]: 2026-01-23 09:12:06.200250989 +0000 UTC m=+0.036963071 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:12:06 compute-0 podman[209946]: 2026-01-23 09:12:06.203358548 +0000 UTC m=+0.041850116 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:12:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:13.986 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:12:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:13.987 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:12:18 compute-0 podman[209985]: 2026-01-23 09:12:18.195186125 +0000 UTC m=+0.034308769 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 09:12:21 compute-0 podman[210001]: 2026-01-23 09:12:21.18945872 +0000 UTC m=+0.027968118 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:12:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:21.988 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:23 compute-0 podman[210022]: 2026-01-23 09:12:23.215159726 +0000 UTC m=+0.051757224 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.357 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.357 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.381 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.519 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.519 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.523 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.523 182096 INFO nova.compute.claims [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.680 182096 DEBUG nova.compute.provider_tree [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.697 182096 DEBUG nova.scheduler.client.report [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.721 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.721 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.801 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.801 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.824 182096 INFO nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.847 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.948 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.949 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.949 182096 INFO nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Creating image(s)
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.950 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.950 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.951 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.951 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:26 compute-0 nova_compute[182092]: 2026-01-23 09:12:26.952 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:27 compute-0 nova_compute[182092]: 2026-01-23 09:12:27.603 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Automatically allocating a network for project 3efab85291684108a8ee2220e9c67932. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.575 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.619 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.part --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.620 182096 DEBUG nova.virt.images [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] 84bf9744-ebe0-4357-9697-347a3a1a297e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.621 182096 DEBUG nova.privsep.utils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.621 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.part /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.678 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.part /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.converted" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.681 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.724 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147.converted --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.725 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:28 compute-0 nova_compute[182092]: 2026-01-23 09:12:28.735 182096 INFO oslo.privsep.daemon [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp0yvg_teo/privsep.sock']
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.250 182096 INFO oslo.privsep.daemon [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.175 210059 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.178 210059 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.179 210059 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.180 210059 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210059
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.312 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.364 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.365 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.366 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.375 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.426 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.427 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.445 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.446 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.446 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.488 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.488 182096 DEBUG nova.virt.disk.api [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Checking if we can resize image /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.489 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.530 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.531 182096 DEBUG nova.virt.disk.api [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Cannot resize image /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.531 182096 DEBUG nova.objects.instance [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lazy-loading 'migration_context' on Instance uuid 258a0672-5288-41e7-977b-bfb206a39d13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.544 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.544 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Ensure instance console log exists: /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.545 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.545 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:29 compute-0 nova_compute[182092]: 2026-01-23 09:12:29.545 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:30 compute-0 podman[210076]: 2026-01-23 09:12:30.22329305 +0000 UTC m=+0.062073593 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:12:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:12:37 compute-0 podman[210101]: 2026-01-23 09:12:37.197849597 +0000 UTC m=+0.034455074 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:12:37 compute-0 podman[210100]: 2026-01-23 09:12:37.203048013 +0000 UTC m=+0.041130027 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:12:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:39.850 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:39.850 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:39.851 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:42 compute-0 nova_compute[182092]: 2026-01-23 09:12:42.607 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Automatically allocated network: {'id': '11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'name': 'auto_allocated_network', 'tenant_id': '3efab85291684108a8ee2220e9c67932', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['14620e29-4e88-430f-b8a4-aa4012ee4f47', 'd3f0e777-e17d-4105-aad2-66c2489612ee'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-23T09:12:28Z', 'updated_at': '2026-01-23T09:12:35Z', 'revision_number': 4, 'project_id': '3efab85291684108a8ee2220e9c67932'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Jan 23 09:12:42 compute-0 nova_compute[182092]: 2026-01-23 09:12:42.614 182096 WARNING oslo_policy.policy [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 09:12:42 compute-0 nova_compute[182092]: 2026-01-23 09:12:42.614 182096 WARNING oslo_policy.policy [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 09:12:42 compute-0 nova_compute[182092]: 2026-01-23 09:12:42.616 182096 DEBUG nova.policy [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4390e6999da44096ad45afe3bbe97b11', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3efab85291684108a8ee2220e9c67932', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:12:44 compute-0 nova_compute[182092]: 2026-01-23 09:12:44.174 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Successfully created port: 06b16ef7-c2eb-4bf9-a688-01fe2726a190 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.676 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.677 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.677 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:12:45 compute-0 nova_compute[182092]: 2026-01-23 09:12:45.689 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:46 compute-0 nova_compute[182092]: 2026-01-23 09:12:46.040 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Successfully updated port: 06b16ef7-c2eb-4bf9-a688-01fe2726a190 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:12:46 compute-0 nova_compute[182092]: 2026-01-23 09:12:46.052 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:12:46 compute-0 nova_compute[182092]: 2026-01-23 09:12:46.053 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquired lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:12:46 compute-0 nova_compute[182092]: 2026-01-23 09:12:46.053 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:12:46 compute-0 nova_compute[182092]: 2026-01-23 09:12:46.300 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:12:47 compute-0 nova_compute[182092]: 2026-01-23 09:12:47.792 182096 DEBUG nova.compute.manager [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-changed-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:47 compute-0 nova_compute[182092]: 2026-01-23 09:12:47.792 182096 DEBUG nova.compute.manager [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Refreshing instance network info cache due to event network-changed-06b16ef7-c2eb-4bf9-a688-01fe2726a190. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:12:47 compute-0 nova_compute[182092]: 2026-01-23 09:12:47.792 182096 DEBUG oslo_concurrency.lockutils [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.177 182096 DEBUG nova.network.neutron [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Updating instance_info_cache with network_info: [{"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.199 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Releasing lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.199 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Instance network_info: |[{"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.200 182096 DEBUG oslo_concurrency.lockutils [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.200 182096 DEBUG nova.network.neutron [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Refreshing network info cache for port 06b16ef7-c2eb-4bf9-a688-01fe2726a190 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.202 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Start _get_guest_xml network_info=[{"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.206 182096 WARNING nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.210 182096 DEBUG nova.virt.libvirt.host [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.210 182096 DEBUG nova.virt.libvirt.host [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.216 182096 DEBUG nova.virt.libvirt.host [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.216 182096 DEBUG nova.virt.libvirt.host [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.217 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.217 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.218 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.218 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.218 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.218 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.219 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.219 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.219 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.220 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.220 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.220 182096 DEBUG nova.virt.hardware [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.223 182096 DEBUG nova.privsep.utils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.224 182096 DEBUG nova.virt.libvirt.vif [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-172295586-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-172295586-2',id=3,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3efab85291684108a8ee2220e9c67932',ramdisk_id='',reservation_id='r-x0tyc13s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1985052390',owner_user_name='tempest-AutoAllocateNetworkTest-1985052390-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:12:26Z,user_data=None,user_id='4390e6999da44096ad45afe3bbe97b11',uuid=258a0672-5288-41e7-977b-bfb206a39d13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.224 182096 DEBUG nova.network.os_vif_util [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converting VIF {"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.225 182096 DEBUG nova.network.os_vif_util [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.226 182096 DEBUG nova.objects.instance [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lazy-loading 'pci_devices' on Instance uuid 258a0672-5288-41e7-977b-bfb206a39d13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.243 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <uuid>258a0672-5288-41e7-977b-bfb206a39d13</uuid>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <name>instance-00000003</name>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:name>tempest-tempest.common.compute-instance-172295586-2</nova:name>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:12:48</nova:creationTime>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:user uuid="4390e6999da44096ad45afe3bbe97b11">tempest-AutoAllocateNetworkTest-1985052390-project-member</nova:user>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:project uuid="3efab85291684108a8ee2220e9c67932">tempest-AutoAllocateNetworkTest-1985052390</nova:project>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         <nova:port uuid="06b16ef7-c2eb-4bf9-a688-01fe2726a190">
Jan 23 09:12:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="fdfe:381f:8400::162" ipVersion="6"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.1.0.37" ipVersion="4"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <system>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="serial">258a0672-5288-41e7-977b-bfb206a39d13</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="uuid">258a0672-5288-41e7-977b-bfb206a39d13</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </system>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <os>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </os>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <features>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </features>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.config"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:b1:cd:a3"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <target dev="tap06b16ef7-c2"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/console.log" append="off"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <video>
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </video>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:12:48 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:12:48 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:12:48 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:12:48 compute-0 nova_compute[182092]: </domain>
Jan 23 09:12:48 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.244 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Preparing to wait for external event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.244 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.245 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.245 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.246 182096 DEBUG nova.virt.libvirt.vif [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-172295586-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-172295586-2',id=3,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3efab85291684108a8ee2220e9c67932',ramdisk_id='',reservation_id='r-x0tyc13s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1985052390',owner_user_name='tempest-AutoAllocateNetworkTest-1985052390-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:12:26Z,user_data=None,user_id='4390e6999da44096ad45afe3bbe97b11',uuid=258a0672-5288-41e7-977b-bfb206a39d13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.246 182096 DEBUG nova.network.os_vif_util [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converting VIF {"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.247 182096 DEBUG nova.network.os_vif_util [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.247 182096 DEBUG os_vif [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.272 182096 DEBUG ovsdbapp.backend.ovs_idl [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.273 182096 DEBUG ovsdbapp.backend.ovs_idl [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.273 182096 DEBUG ovsdbapp.backend.ovs_idl [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.273 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.276 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.277 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.285 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.285 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.285 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.286 182096 INFO oslo.privsep.daemon [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp1x_c689b/privsep.sock']
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.700 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.701 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.701 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.720 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.720 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.720 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.721 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.850 182096 INFO oslo.privsep.daemon [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.747 210142 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.751 210142 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.753 210142 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.753 210142 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210142
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.928 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.929 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6002MB free_disk=73.38175582885742GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.929 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:48 compute-0 nova_compute[182092]: 2026-01-23 09:12:48.929 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.129 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.130 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06b16ef7-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.130 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06b16ef7-c2, col_values=(('external_ids', {'iface-id': '06b16ef7-c2eb-4bf9-a688-01fe2726a190', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:cd:a3', 'vm-uuid': '258a0672-5288-41e7-977b-bfb206a39d13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.131 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 NetworkManager[54920]: <info>  [1769159569.1327] manager: (tap06b16ef7-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.134 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.136 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.137 182096 INFO os_vif [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2')
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.140 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 258a0672-5288-41e7-977b-bfb206a39d13 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.140 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.140 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.166 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.166 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.166 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] No VIF found with MAC fa:16:3e:b1:cd:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.167 182096 INFO nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Using config drive
Jan 23 09:12:49 compute-0 podman[210148]: 2026-01-23 09:12:49.200197029 +0000 UTC m=+0.039795801 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.292 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.429 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.429 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.444 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.471 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.514 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.556 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updated inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.556 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.557 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.584 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.584 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.650 182096 INFO nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Creating config drive at /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.config
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.654 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc3_5i435 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.771 182096 DEBUG oslo_concurrency.processutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc3_5i435" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:49 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 23 09:12:49 compute-0 kernel: tap06b16ef7-c2: entered promiscuous mode
Jan 23 09:12:49 compute-0 NetworkManager[54920]: <info>  [1769159569.8181] manager: (tap06b16ef7-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 09:12:49 compute-0 ovn_controller[94697]: 2026-01-23T09:12:49Z|00027|binding|INFO|Claiming lport 06b16ef7-c2eb-4bf9-a688-01fe2726a190 for this chassis.
Jan 23 09:12:49 compute-0 ovn_controller[94697]: 2026-01-23T09:12:49Z|00028|binding|INFO|06b16ef7-c2eb-4bf9-a688-01fe2726a190: Claiming fa:16:3e:b1:cd:a3 10.1.0.37 fdfe:381f:8400::162
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.821 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.824 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:49.835 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:cd:a3 10.1.0.37 fdfe:381f:8400::162'], port_security=['fa:16:3e:b1:cd:a3 10.1.0.37 fdfe:381f:8400::162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.37/26 fdfe:381f:8400::162/64', 'neutron:device_id': '258a0672-5288-41e7-977b-bfb206a39d13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3efab85291684108a8ee2220e9c67932', 'neutron:revision_number': '2', 'neutron:security_group_ids': '79c15c41-0873-47e2-bbf8-f0e826f053ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea511455-5e09-4416-97c6-e3f649747f17, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=06b16ef7-c2eb-4bf9-a688-01fe2726a190) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:12:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:49.836 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 06b16ef7-c2eb-4bf9-a688-01fe2726a190 in datapath 11b3dcdb-6885-4a72-84ec-a7def2b972c9 bound to our chassis
Jan 23 09:12:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:49.838 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 11b3dcdb-6885-4a72-84ec-a7def2b972c9
Jan 23 09:12:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:49.838 103978 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpcloegrwt/privsep.sock']
Jan 23 09:12:49 compute-0 systemd-udevd[210184]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:12:49 compute-0 NetworkManager[54920]: <info>  [1769159569.8630] device (tap06b16ef7-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:12:49 compute-0 NetworkManager[54920]: <info>  [1769159569.8636] device (tap06b16ef7-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:12:49 compute-0 systemd-machined[153562]: New machine qemu-1-instance-00000003.
Jan 23 09:12:49 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.895 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:49 compute-0 ovn_controller[94697]: 2026-01-23T09:12:49Z|00029|binding|INFO|Setting lport 06b16ef7-c2eb-4bf9-a688-01fe2726a190 ovn-installed in OVS
Jan 23 09:12:49 compute-0 ovn_controller[94697]: 2026-01-23T09:12:49Z|00030|binding|INFO|Setting lport 06b16ef7-c2eb-4bf9-a688-01fe2726a190 up in Southbound
Jan 23 09:12:49 compute-0 nova_compute[182092]: 2026-01-23 09:12:49.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.184 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159570.1833925, 258a0672-5288-41e7-977b-bfb206a39d13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.185 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] VM Started (Lifecycle Event)
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.199 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.202 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159570.184411, 258a0672-5288-41e7-977b-bfb206a39d13 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.202 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] VM Paused (Lifecycle Event)
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.219 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.221 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.232 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.389 103978 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.389 103978 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcloegrwt/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.304 210209 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.307 210209 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.309 210209 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.309 210209 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210209
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.392 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5018ec-5d32-43d2-99ca-cf91db53811a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.486 182096 DEBUG nova.network.neutron [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Updated VIF entry in instance network info cache for port 06b16ef7-c2eb-4bf9-a688-01fe2726a190. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.487 182096 DEBUG nova.network.neutron [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Updating instance_info_cache with network_info: [{"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.514 182096 DEBUG oslo_concurrency.lockutils [req-16614232-cef3-4045-8227-5154534559c8 req-dc652c49-8530-4e41-a14e-3e3b809e1c21 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-258a0672-5288-41e7-977b-bfb206a39d13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.532 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.533 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.533 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.542 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.542 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.543 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.543 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.543 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:12:50 compute-0 nova_compute[182092]: 2026-01-23 09:12:50.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.815 210209 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.816 210209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:50.816 210209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.280 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[47cabcbc-cbc9-4005-99af-44658f3e6b78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.281 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap11b3dcdb-61 in ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.282 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap11b3dcdb-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.283 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ec59d386-3cc6-4f60-ba31-f2414992abc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.285 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[685699ec-6980-426c-b542-d9df68c2129e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.300 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[db550bdb-147d-433a-aa1d-9fe07e3a5b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.309 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[847c9e06-7cc1-43f4-b01f-0bc027007e38]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.310 103978 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzlfx7f7p/privsep.sock']
Jan 23 09:12:51 compute-0 podman[210218]: 2026-01-23 09:12:51.389151501 +0000 UTC m=+0.068870565 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:12:51 compute-0 nova_compute[182092]: 2026-01-23 09:12:51.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:51 compute-0 nova_compute[182092]: 2026-01-23 09:12:51.822 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.834 103978 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.834 103978 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzlfx7f7p/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.755 210245 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.758 210245 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.760 210245 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.760 210245 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210245
Jan 23 09:12:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:51.836 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f9523e-21a6-46c2-b86a-d0e6316a7b33]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.239 210245 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.239 210245 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.239 210245 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.602 182096 DEBUG nova.compute.manager [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.603 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.603 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.603 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG nova.compute.manager [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Processing event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG nova.compute.manager [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG oslo_concurrency.lockutils [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.604 182096 DEBUG nova.compute.manager [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] No waiting events found dispatching network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.605 182096 WARNING nova.compute.manager [req-b11f85ac-96e9-414e-bc72-e64325dc0ae4 req-23a64529-700e-4a4e-acce-202aa84405c0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received unexpected event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 for instance with vm_state building and task_state spawning.
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.605 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.607 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159572.6074266, 258a0672-5288-41e7-977b-bfb206a39d13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.607 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] VM Resumed (Lifecycle Event)
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.609 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.614 182096 INFO nova.virt.libvirt.driver [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Instance spawned successfully.
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.615 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.629 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.632 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.635 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.635 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.636 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.636 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.637 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.637 182096 DEBUG nova.virt.libvirt.driver [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.651 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.652 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.711 182096 INFO nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Took 25.76 seconds to spawn the instance on the hypervisor.
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.711 182096 DEBUG nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.711 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5829f61c-b706-48bb-b090-da57f995349f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 NetworkManager[54920]: <info>  [1769159572.7254] manager: (tap11b3dcdb-60): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.724 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[59ed245a-4a5a-4b9e-b33d-3a48fa53dcf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 systemd-udevd[210187]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.748 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1de13392-e47f-4a42-8fb0-434ec6acb751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.750 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a782eb1b-c9bd-4d86-bd5b-ebb26d4d9dfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.762 182096 INFO nova.compute.manager [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Took 26.30 seconds to build instance.
Jan 23 09:12:52 compute-0 NetworkManager[54920]: <info>  [1769159572.7686] device (tap11b3dcdb-60): carrier: link connected
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.772 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[36886af1-8a11-43a8-9cf6-ac6369679d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.773 182096 DEBUG oslo_concurrency.lockutils [None req-f9e910a7-bddb-4997-9e99-9e38c4f039c2 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.784 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce4add4-1115-4126-a40b-35598e5828e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11b3dcdb-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:86:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 307825, 'reachable_time': 29931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210260, 'error': None, 'target': 'ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.792 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4b10ae-6074-48aa-a67a-de1c6480b093]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:8654'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 307825, 'tstamp': 307825}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210261, 'error': None, 'target': 'ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.802 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[517c1383-a81d-4287-8faa-1393c2866d87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11b3dcdb-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:86:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 307825, 'reachable_time': 29931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210262, 'error': None, 'target': 'ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.822 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[83c3c515-dfda-4caf-a48b-710db371368a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.858 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fe21ca-f58f-4c37-ad92-2fc2fce2d73e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.859 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11b3dcdb-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.860 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.860 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11b3dcdb-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:52 compute-0 NetworkManager[54920]: <info>  [1769159572.8627] manager: (tap11b3dcdb-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 23 09:12:52 compute-0 kernel: tap11b3dcdb-60: entered promiscuous mode
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.865 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.867 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap11b3dcdb-60, col_values=(('external_ids', {'iface-id': '0670c6e4-6f65-4709-b7f7-edcac9754fdb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:52 compute-0 ovn_controller[94697]: 2026-01-23T09:12:52Z|00031|binding|INFO|Releasing lport 0670c6e4-6f65-4709-b7f7-edcac9754fdb from this chassis (sb_readonly=0)
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.867 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.870 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/11b3dcdb-6885-4a72-84ec-a7def2b972c9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/11b3dcdb-6885-4a72-84ec-a7def2b972c9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.870 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ff37bc-36b9-47cf-8705-2c73f2af0370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.871 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-11b3dcdb-6885-4a72-84ec-a7def2b972c9
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/11b3dcdb-6885-4a72-84ec-a7def2b972c9.pid.haproxy
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 11b3dcdb-6885-4a72-84ec-a7def2b972c9
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:12:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:52.872 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'env', 'PROCESS_TAG=haproxy-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/11b3dcdb-6885-4a72-84ec-a7def2b972c9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:12:52 compute-0 nova_compute[182092]: 2026-01-23 09:12:52.880 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:53 compute-0 podman[210291]: 2026-01-23 09:12:53.153306091 +0000 UTC m=+0.037467006 container create 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:12:53 compute-0 systemd[1]: Started libpod-conmon-5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8.scope.
Jan 23 09:12:53 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:12:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6509d37ed0f149b3772fe7fdfcba274c237ffb9037db397aa012fd2f0cc21961/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:12:53 compute-0 podman[210291]: 2026-01-23 09:12:53.218103242 +0000 UTC m=+0.102264157 container init 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:12:53 compute-0 podman[210291]: 2026-01-23 09:12:53.222542065 +0000 UTC m=+0.106702980 container start 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:12:53 compute-0 podman[210291]: 2026-01-23 09:12:53.134753515 +0000 UTC m=+0.018914450 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:12:53 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [NOTICE]   (210307) : New worker (210309) forked
Jan 23 09:12:53 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [NOTICE]   (210307) : Loading success.
Jan 23 09:12:54 compute-0 nova_compute[182092]: 2026-01-23 09:12:54.131 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:54 compute-0 podman[210314]: 2026-01-23 09:12:54.203557712 +0000 UTC m=+0.041469106 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.430 182096 DEBUG nova.compute.manager [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.554 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.555 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.576 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.604 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.604 182096 INFO nova.compute.claims [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.604 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'resources' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.615 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.624 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.651 182096 INFO nova.compute.resource_tracker [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updating resource usage from migration 83ed9d4c-3050-4dac-bb1c-6b60524470db
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.651 182096 DEBUG nova.compute.resource_tracker [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Starting to track incoming migration 83ed9d4c-3050-4dac-bb1c-6b60524470db with flavor 98e818ca-8ca1-4177-8a64-bde266c399d2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.709 182096 DEBUG nova.compute.provider_tree [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.720 182096 DEBUG nova.scheduler.client.report [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.731 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.731 182096 INFO nova.compute.manager [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Migrating
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.731 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.732 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.742 182096 INFO nova.compute.rpcapi [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.742 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.825 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.881 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.882 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.893 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.970 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.972 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.977 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:12:56 compute-0 nova_compute[182092]: 2026-01-23 09:12:56.977 182096 INFO nova.compute.claims [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.094 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.095 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.095 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.095 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.095 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.106 182096 INFO nova.compute.manager [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Terminating instance
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.113 182096 DEBUG nova.compute.manager [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:12:57 compute-0 kernel: tap06b16ef7-c2 (unregistering): left promiscuous mode
Jan 23 09:12:57 compute-0 NetworkManager[54920]: <info>  [1769159577.1309] device (tap06b16ef7-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:12:57 compute-0 ovn_controller[94697]: 2026-01-23T09:12:57Z|00032|binding|INFO|Releasing lport 06b16ef7-c2eb-4bf9-a688-01fe2726a190 from this chassis (sb_readonly=0)
Jan 23 09:12:57 compute-0 ovn_controller[94697]: 2026-01-23T09:12:57Z|00033|binding|INFO|Setting lport 06b16ef7-c2eb-4bf9-a688-01fe2726a190 down in Southbound
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.140 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 ovn_controller[94697]: 2026-01-23T09:12:57Z|00034|binding|INFO|Removing iface tap06b16ef7-c2 ovn-installed in OVS
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.146 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:cd:a3 10.1.0.37 fdfe:381f:8400::162'], port_security=['fa:16:3e:b1:cd:a3 10.1.0.37 fdfe:381f:8400::162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.37/26 fdfe:381f:8400::162/64', 'neutron:device_id': '258a0672-5288-41e7-977b-bfb206a39d13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3efab85291684108a8ee2220e9c67932', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79c15c41-0873-47e2-bbf8-f0e826f053ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea511455-5e09-4416-97c6-e3f649747f17, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=06b16ef7-c2eb-4bf9-a688-01fe2726a190) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.148 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 06b16ef7-c2eb-4bf9-a688-01fe2726a190 in datapath 11b3dcdb-6885-4a72-84ec-a7def2b972c9 unbound from our chassis
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.149 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11b3dcdb-6885-4a72-84ec-a7def2b972c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.149 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c8832b46-c0bd-497f-ad79-04d71fa3738e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.150 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9 namespace which is not needed anymore
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.157 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 23 09:12:57 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 4.748s CPU time.
Jan 23 09:12:57 compute-0 systemd-machined[153562]: Machine qemu-1-instance-00000003 terminated.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.190 182096 DEBUG nova.compute.provider_tree [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.205 182096 DEBUG nova.scheduler.client.report [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.222 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.222 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:12:57 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [NOTICE]   (210307) : haproxy version is 2.8.14-c23fe91
Jan 23 09:12:57 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [NOTICE]   (210307) : path to executable is /usr/sbin/haproxy
Jan 23 09:12:57 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [ALERT]    (210307) : Current worker (210309) exited with code 143 (Terminated)
Jan 23 09:12:57 compute-0 neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9[210303]: [WARNING]  (210307) : All workers exited. Exiting... (0)
Jan 23 09:12:57 compute-0 systemd[1]: libpod-5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8.scope: Deactivated successfully.
Jan 23 09:12:57 compute-0 podman[210353]: 2026-01-23 09:12:57.252198332 +0000 UTC m=+0.035977217 container died 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:12:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8-userdata-shm.mount: Deactivated successfully.
Jan 23 09:12:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-6509d37ed0f149b3772fe7fdfcba274c237ffb9037db397aa012fd2f0cc21961-merged.mount: Deactivated successfully.
Jan 23 09:12:57 compute-0 podman[210353]: 2026-01-23 09:12:57.273806383 +0000 UTC m=+0.057585268 container cleanup 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:12:57 compute-0 systemd[1]: libpod-conmon-5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8.scope: Deactivated successfully.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.294 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.294 182096 DEBUG nova.network.neutron [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.305 182096 DEBUG nova.compute.manager [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-unplugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.305 182096 DEBUG oslo_concurrency.lockutils [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.305 182096 DEBUG oslo_concurrency.lockutils [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.305 182096 DEBUG oslo_concurrency.lockutils [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.305 182096 DEBUG nova.compute.manager [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] No waiting events found dispatching network-vif-unplugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.306 182096 DEBUG nova.compute.manager [req-b9bc86f0-f013-4bd0-830a-37ba7380c4a2 req-7ded01c5-850a-45ad-b59a-14b6e816bd42 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-unplugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.309 182096 INFO nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:12:57 compute-0 podman[210378]: 2026-01-23 09:12:57.315681944 +0000 UTC m=+0.026307065 container remove 5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.320 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[74c72396-2a50-4934-a79a-307ceb6b3131]: (4, ('Fri Jan 23 09:12:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9 (5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8)\n5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8\nFri Jan 23 09:12:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9 (5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8)\n5682d0964804dbabafdb5409d2bcb18809cb06fe0226331b0fc659af3f7631b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.321 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20c6af9a-2b61-4c82-b8c5-7a43797d3e00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.322 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11b3dcdb-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.324 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.336 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.337 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 kernel: tap11b3dcdb-60: left promiscuous mode
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.343 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.344 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2b1b54-22ea-44f4-af73-7c8085a9b7e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.354 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50663b6d-ea2b-4617-bfc6-45b82f1f4c5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.355 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7e9623-f3bc-453f-8d32-b26ee129c431]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.366 182096 INFO nova.virt.libvirt.driver [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Instance destroyed successfully.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.367 182096 DEBUG nova.objects.instance [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lazy-loading 'resources' on Instance uuid 258a0672-5288-41e7-977b-bfb206a39d13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.368 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[46312470-8207-4dae-84dc-7e3270253fa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 307819, 'reachable_time': 27628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210406, 'error': None, 'target': 'ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d11b3dcdb\x2d6885\x2d4a72\x2d84ec\x2da7def2b972c9.mount: Deactivated successfully.
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.375 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-11b3dcdb-6885-4a72-84ec-a7def2b972c9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:12:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:12:57.375 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[c3441ea0-b968-4589-8d72-11e09f3076ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.379 182096 DEBUG nova.virt.libvirt.vif [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-172295586-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-172295586-2',id=3,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T09:12:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3efab85291684108a8ee2220e9c67932',ramdisk_id='',reservation_id='r-x0tyc13s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-1985052390',owner_user_name='tempest-AutoAllocateNetworkTest-1985052390-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:12:52Z,user_data=None,user_id='4390e6999da44096ad45afe3bbe97b11',uuid=258a0672-5288-41e7-977b-bfb206a39d13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.379 182096 DEBUG nova.network.os_vif_util [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converting VIF {"id": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "address": "fa:16:3e:b1:cd:a3", "network": {"id": "11b3dcdb-6885-4a72-84ec-a7def2b972c9", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::162", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3efab85291684108a8ee2220e9c67932", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b16ef7-c2", "ovs_interfaceid": "06b16ef7-c2eb-4bf9-a688-01fe2726a190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.380 182096 DEBUG nova.network.os_vif_util [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.380 182096 DEBUG os_vif [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.382 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.382 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06b16ef7-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.385 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.386 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.387 182096 INFO os_vif [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:cd:a3,bridge_name='br-int',has_traffic_filtering=True,id=06b16ef7-c2eb-4bf9-a688-01fe2726a190,network=Network(11b3dcdb-6885-4a72-84ec-a7def2b972c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b16ef7-c2')
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.387 182096 INFO nova.virt.libvirt.driver [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Deleting instance files /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13_del
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.388 182096 INFO nova.virt.libvirt.driver [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Deletion of /var/lib/nova/instances/258a0672-5288-41e7-977b-bfb206a39d13_del complete
Jan 23 09:12:57 compute-0 sshd-session[210401]: Accepted publickey for nova from 192.168.122.101 port 59788 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.433 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.434 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.434 182096 INFO nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Creating image(s)
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.434 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.435 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.435 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:12:57 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.450 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:57 compute-0 systemd-logind[746]: New session 26 of user nova.
Jan 23 09:12:57 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.463 182096 DEBUG nova.virt.libvirt.host [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.464 182096 INFO nova.virt.libvirt.host [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] UEFI support detected
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.465 182096 INFO nova.compute.manager [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.465 182096 DEBUG oslo.service.loopingcall [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.466 182096 DEBUG nova.compute.manager [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.466 182096 DEBUG nova.network.neutron [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:12:57 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:12:57 compute-0 systemd[210413]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.499 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.499 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.500 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.510 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:57 compute-0 systemd[210413]: Queued start job for default target Main User Target.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.565 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.566 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:57 compute-0 systemd[210413]: Created slice User Application Slice.
Jan 23 09:12:57 compute-0 systemd[210413]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:12:57 compute-0 systemd[210413]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:12:57 compute-0 systemd[210413]: Reached target Paths.
Jan 23 09:12:57 compute-0 systemd[210413]: Reached target Timers.
Jan 23 09:12:57 compute-0 systemd[210413]: Starting D-Bus User Message Bus Socket...
Jan 23 09:12:57 compute-0 systemd[210413]: Starting Create User's Volatile Files and Directories...
Jan 23 09:12:57 compute-0 systemd[210413]: Finished Create User's Volatile Files and Directories.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.586 182096 DEBUG nova.network.neutron [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.587 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:12:57 compute-0 systemd[210413]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:12:57 compute-0 systemd[210413]: Reached target Sockets.
Jan 23 09:12:57 compute-0 systemd[210413]: Reached target Basic System.
Jan 23 09:12:57 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.590 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:57 compute-0 systemd[210413]: Reached target Main User Target.
Jan 23 09:12:57 compute-0 systemd[210413]: Startup finished in 102ms.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.591 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.591 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:57 compute-0 systemd[1]: Started Session 26 of User nova.
Jan 23 09:12:57 compute-0 sshd-session[210401]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.638 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.638 182096 DEBUG nova.virt.disk.api [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Checking if we can resize image /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.639 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:57 compute-0 sshd-session[210439]: Received disconnect from 192.168.122.101 port 59788:11: disconnected by user
Jan 23 09:12:57 compute-0 sshd-session[210439]: Disconnected from user nova 192.168.122.101 port 59788
Jan 23 09:12:57 compute-0 sshd-session[210401]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:12:57 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 09:12:57 compute-0 systemd-logind[746]: Session 26 logged out. Waiting for processes to exit.
Jan 23 09:12:57 compute-0 systemd-logind[746]: Removed session 26.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.685 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.686 182096 DEBUG nova.virt.disk.api [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Cannot resize image /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.687 182096 DEBUG nova.objects.instance [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.699 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.699 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Ensure instance console log exists: /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.699 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.699 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.700 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.701 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.704 182096 WARNING nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.708 182096 DEBUG nova.virt.libvirt.host [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.708 182096 DEBUG nova.virt.libvirt.host [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.710 182096 DEBUG nova.virt.libvirt.host [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.711 182096 DEBUG nova.virt.libvirt.host [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.712 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.712 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.712 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.712 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.713 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.714 182096 DEBUG nova.virt.hardware [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.716 182096 DEBUG nova.objects.instance [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.727 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <uuid>7fab73f4-ebb4-47b3-a487-7eeecb2e5295</uuid>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <name>instance-00000007</name>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:name>tempest-LiveMigrationNegativeTest-server-1290226659</nova:name>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:12:57</nova:creationTime>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:user uuid="3c390a2519314730b83c3740b021300b">tempest-LiveMigrationNegativeTest-243957153-project-member</nova:user>
Jan 23 09:12:57 compute-0 nova_compute[182092]:         <nova:project uuid="eebdcae71b144c709bb308e3489f8dd9">tempest-LiveMigrationNegativeTest-243957153</nova:project>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <system>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="serial">7fab73f4-ebb4-47b3-a487-7eeecb2e5295</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="uuid">7fab73f4-ebb4-47b3-a487-7eeecb2e5295</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </system>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <os>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </os>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <features>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </features>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.config"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/console.log" append="off"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <video>
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </video>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:12:57 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:12:57 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:12:57 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:12:57 compute-0 nova_compute[182092]: </domain>
Jan 23 09:12:57 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:12:57 compute-0 sshd-session[210444]: Accepted publickey for nova from 192.168.122.101 port 59790 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:12:57 compute-0 systemd-logind[746]: New session 28 of user nova.
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.772 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.772 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:12:57 compute-0 nova_compute[182092]: 2026-01-23 09:12:57.773 182096 INFO nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Using config drive
Jan 23 09:12:57 compute-0 systemd[1]: Started Session 28 of User nova.
Jan 23 09:12:57 compute-0 sshd-session[210444]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:12:57 compute-0 sshd-session[210450]: Received disconnect from 192.168.122.101 port 59790:11: disconnected by user
Jan 23 09:12:57 compute-0 sshd-session[210450]: Disconnected from user nova 192.168.122.101 port 59790
Jan 23 09:12:57 compute-0 sshd-session[210444]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:12:57 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 23 09:12:57 compute-0 systemd-logind[746]: Session 28 logged out. Waiting for processes to exit.
Jan 23 09:12:57 compute-0 systemd-logind[746]: Removed session 28.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.099 182096 INFO nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Creating config drive at /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.config
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.104 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprst5no2g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.222 182096 DEBUG oslo_concurrency.processutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprst5no2g" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:12:58 compute-0 systemd-machined[153562]: New machine qemu-2-instance-00000007.
Jan 23 09:12:58 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000007.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.315 182096 DEBUG nova.network.neutron [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.325 182096 INFO nova.compute.manager [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Took 0.86 seconds to deallocate network for instance.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.383 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.384 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.407 182096 DEBUG nova.compute.manager [req-53495a72-1b14-4ad2-90ee-ac8190405621 req-673b86c7-3e2c-4617-b9e3-c3f3884c8a55 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-deleted-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.451 182096 DEBUG nova.compute.provider_tree [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.461 182096 DEBUG nova.scheduler.client.report [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.474 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.495 182096 INFO nova.scheduler.client.report [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Deleted allocations for instance 258a0672-5288-41e7-977b-bfb206a39d13
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.540 182096 DEBUG oslo_concurrency.lockutils [None req-e5cec14c-6afd-4e45-9166-e00e4cd5f498 4390e6999da44096ad45afe3bbe97b11 3efab85291684108a8ee2220e9c67932 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.566 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159578.5650513, 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.566 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] VM Resumed (Lifecycle Event)
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.567 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.567 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.570 182096 INFO nova.virt.libvirt.driver [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance spawned successfully.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.570 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.578 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.579 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.585 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.585 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.585 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.585 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.586 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.586 182096 DEBUG nova.virt.libvirt.driver [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.593 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.593 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159578.5662715, 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.593 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] VM Started (Lifecycle Event)
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.615 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.617 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.631 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.656 182096 INFO nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Took 1.22 seconds to spawn the instance on the hypervisor.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.656 182096 DEBUG nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.750 182096 INFO nova.compute.manager [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Took 1.80 seconds to build instance.
Jan 23 09:12:58 compute-0 nova_compute[182092]: 2026-01-23 09:12:58.828 182096 DEBUG oslo_concurrency.lockutils [None req-d1b82ee4-c6fb-4802-a6d3-1299c802df2f 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.946s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.425 182096 DEBUG nova.compute.manager [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.425 182096 DEBUG oslo_concurrency.lockutils [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "258a0672-5288-41e7-977b-bfb206a39d13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.426 182096 DEBUG oslo_concurrency.lockutils [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.426 182096 DEBUG oslo_concurrency.lockutils [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "258a0672-5288-41e7-977b-bfb206a39d13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.426 182096 DEBUG nova.compute.manager [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] No waiting events found dispatching network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.426 182096 WARNING nova.compute.manager [req-69bd0b9d-7f5e-4c0d-819c-77daefee7701 req-955da908-8236-46ba-8a45-3f2bbf3dcee4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Received unexpected event network-vif-plugged-06b16ef7-c2eb-4bf9-a688-01fe2726a190 for instance with vm_state deleted and task_state None.
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.788 182096 DEBUG nova.objects.instance [None req-2018c6d9-6b8f-4827-bb80-a57b589cf597 a2118ce30c3f4ba2b975ec4cc3f9a52d 0f3b5ca1d70a40d498b104af49eef5a0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.803 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159579.8037672, 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.804 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] VM Paused (Lifecycle Event)
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.824 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.826 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:12:59 compute-0 nova_compute[182092]: 2026-01-23 09:12:59.856 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 23 09:13:00 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 23 09:13:00 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Consumed 1.582s CPU time.
Jan 23 09:13:00 compute-0 systemd-machined[153562]: Machine qemu-2-instance-00000007 terminated.
Jan 23 09:13:00 compute-0 nova_compute[182092]: 2026-01-23 09:13:00.375 182096 DEBUG nova.compute.manager [None req-2018c6d9-6b8f-4827-bb80-a57b589cf597 a2118ce30c3f4ba2b975ec4cc3f9a52d 0f3b5ca1d70a40d498b104af49eef5a0 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:01 compute-0 podman[210491]: 2026-01-23 09:13:01.224255157 +0000 UTC m=+0.060523870 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:13:01 compute-0 nova_compute[182092]: 2026-01-23 09:13:01.825 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.383 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.625 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.625 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.626 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.626 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.626 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.632 182096 INFO nova.compute.manager [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Terminating instance
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.637 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "refresh_cache-7fab73f4-ebb4-47b3-a487-7eeecb2e5295" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.637 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquired lock "refresh_cache-7fab73f4-ebb4-47b3-a487-7eeecb2e5295" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.637 182096 DEBUG nova.network.neutron [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:13:02 compute-0 nova_compute[182092]: 2026-01-23 09:13:02.798 182096 DEBUG nova.network.neutron [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.064 182096 DEBUG nova.network.neutron [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.078 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Releasing lock "refresh_cache-7fab73f4-ebb4-47b3-a487-7eeecb2e5295" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.078 182096 DEBUG nova.compute.manager [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.083 182096 INFO nova.virt.libvirt.driver [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance destroyed successfully.
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.083 182096 DEBUG nova.objects.instance [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lazy-loading 'resources' on Instance uuid 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.240 182096 INFO nova.virt.libvirt.driver [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Deleting instance files /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295_del
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.241 182096 INFO nova.virt.libvirt.driver [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Deletion of /var/lib/nova/instances/7fab73f4-ebb4-47b3-a487-7eeecb2e5295_del complete
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.407 182096 INFO nova.compute.manager [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.407 182096 DEBUG oslo.service.loopingcall [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.408 182096 DEBUG nova.compute.manager [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.408 182096 DEBUG nova.network.neutron [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.664 182096 DEBUG nova.network.neutron [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.674 182096 DEBUG nova.network.neutron [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.682 182096 INFO nova.compute.manager [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Took 0.27 seconds to deallocate network for instance.
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.727 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.728 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.791 182096 DEBUG nova.compute.provider_tree [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.802 182096 DEBUG nova.scheduler.client.report [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.816 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.847 182096 INFO nova.scheduler.client.report [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Deleted allocations for instance 7fab73f4-ebb4-47b3-a487-7eeecb2e5295
Jan 23 09:13:03 compute-0 nova_compute[182092]: 2026-01-23 09:13:03.908 182096 DEBUG oslo_concurrency.lockutils [None req-5bbcfc39-bf44-457f-bdef-c7e955cc9211 3c390a2519314730b83c3740b021300b eebdcae71b144c709bb308e3489f8dd9 - - default default] Lock "7fab73f4-ebb4-47b3-a487-7eeecb2e5295" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:04 compute-0 nova_compute[182092]: 2026-01-23 09:13:04.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:06 compute-0 nova_compute[182092]: 2026-01-23 09:13:06.827 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:07 compute-0 nova_compute[182092]: 2026-01-23 09:13:07.384 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:07 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:13:07 compute-0 systemd[210413]: Activating special unit Exit the Session...
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped target Main User Target.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped target Basic System.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped target Paths.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped target Sockets.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped target Timers.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:07 compute-0 systemd[210413]: Closed D-Bus User Message Bus Socket.
Jan 23 09:13:07 compute-0 systemd[210413]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:13:07 compute-0 systemd[210413]: Removed slice User Application Slice.
Jan 23 09:13:07 compute-0 systemd[210413]: Reached target Shutdown.
Jan 23 09:13:07 compute-0 systemd[210413]: Finished Exit the Session.
Jan 23 09:13:07 compute-0 systemd[210413]: Reached target Exit the Session.
Jan 23 09:13:07 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:13:07 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:13:07 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:13:07 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:13:07 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:13:07 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:13:07 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:13:07 compute-0 podman[210514]: 2026-01-23 09:13:07.971926358 +0000 UTC m=+0.058077434 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:13:07 compute-0 podman[210515]: 2026-01-23 09:13:07.976559588 +0000 UTC m=+0.051434252 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:13:11 compute-0 sshd-session[210553]: Accepted publickey for nova from 192.168.122.101 port 52990 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:11 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:13:11 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:13:11 compute-0 systemd-logind[746]: New session 29 of user nova.
Jan 23 09:13:11 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:13:11 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:13:11 compute-0 systemd[210557]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:11 compute-0 systemd[210557]: Queued start job for default target Main User Target.
Jan 23 09:13:11 compute-0 systemd[210557]: Created slice User Application Slice.
Jan 23 09:13:11 compute-0 systemd[210557]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:11 compute-0 systemd[210557]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:11 compute-0 systemd[210557]: Reached target Paths.
Jan 23 09:13:11 compute-0 systemd[210557]: Reached target Timers.
Jan 23 09:13:11 compute-0 systemd[210557]: Starting D-Bus User Message Bus Socket...
Jan 23 09:13:11 compute-0 systemd[210557]: Starting Create User's Volatile Files and Directories...
Jan 23 09:13:11 compute-0 systemd[210557]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:13:11 compute-0 systemd[210557]: Reached target Sockets.
Jan 23 09:13:11 compute-0 systemd[210557]: Finished Create User's Volatile Files and Directories.
Jan 23 09:13:11 compute-0 systemd[210557]: Reached target Basic System.
Jan 23 09:13:11 compute-0 systemd[210557]: Reached target Main User Target.
Jan 23 09:13:11 compute-0 systemd[210557]: Startup finished in 90ms.
Jan 23 09:13:11 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:13:11 compute-0 systemd[1]: Started Session 29 of User nova.
Jan 23 09:13:11 compute-0 sshd-session[210553]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:11 compute-0 sshd-session[210572]: Received disconnect from 192.168.122.101 port 52990:11: disconnected by user
Jan 23 09:13:11 compute-0 sshd-session[210572]: Disconnected from user nova 192.168.122.101 port 52990
Jan 23 09:13:11 compute-0 sshd-session[210553]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:11 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Session 29 logged out. Waiting for processes to exit.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Removed session 29.
Jan 23 09:13:11 compute-0 sshd-session[210574]: Accepted publickey for nova from 192.168.122.101 port 53002 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:11 compute-0 systemd-logind[746]: New session 31 of user nova.
Jan 23 09:13:11 compute-0 systemd[1]: Started Session 31 of User nova.
Jan 23 09:13:11 compute-0 sshd-session[210574]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:11 compute-0 sshd-session[210577]: Received disconnect from 192.168.122.101 port 53002:11: disconnected by user
Jan 23 09:13:11 compute-0 sshd-session[210577]: Disconnected from user nova 192.168.122.101 port 53002
Jan 23 09:13:11 compute-0 sshd-session[210574]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:11 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Session 31 logged out. Waiting for processes to exit.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Removed session 31.
Jan 23 09:13:11 compute-0 sshd-session[210579]: Accepted publickey for nova from 192.168.122.101 port 53006 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:11 compute-0 systemd-logind[746]: New session 32 of user nova.
Jan 23 09:13:11 compute-0 nova_compute[182092]: 2026-01-23 09:13:11.828 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:11 compute-0 systemd[1]: Started Session 32 of User nova.
Jan 23 09:13:11 compute-0 sshd-session[210579]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:11 compute-0 sshd-session[210582]: Received disconnect from 192.168.122.101 port 53006:11: disconnected by user
Jan 23 09:13:11 compute-0 sshd-session[210582]: Disconnected from user nova 192.168.122.101 port 53006
Jan 23 09:13:11 compute-0 sshd-session[210579]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:11 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Session 32 logged out. Waiting for processes to exit.
Jan 23 09:13:11 compute-0 systemd-logind[746]: Removed session 32.
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.363 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159577.362514, 258a0672-5288-41e7-977b-bfb206a39d13 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.363 182096 INFO nova.compute.manager [-] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] VM Stopped (Lifecycle Event)
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.456 182096 DEBUG nova.compute.manager [None req-773b965e-6617-473c-8906-abd1f4f7933e - - - - - -] [instance: 258a0672-5288-41e7-977b-bfb206a39d13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.508 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.508 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquired lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:12 compute-0 nova_compute[182092]: 2026-01-23 09:13:12.508 182096 DEBUG nova.network.neutron [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.067 182096 DEBUG nova.network.neutron [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.389 182096 DEBUG nova.network.neutron [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.402 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Releasing lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.667 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.668 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.668 182096 INFO nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Creating image(s)
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.669 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.680 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.732 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.733 182096 DEBUG nova.virt.disk.api [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Checking if we can resize image /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.733 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.778 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.779 182096 DEBUG nova.virt.disk.api [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Cannot resize image /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.791 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.792 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Ensure instance console log exists: /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.792 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.792 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.793 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.794 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.797 182096 WARNING nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.801 182096 DEBUG nova.virt.libvirt.host [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.802 182096 DEBUG nova.virt.libvirt.host [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.805 182096 DEBUG nova.virt.libvirt.host [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.805 182096 DEBUG nova.virt.libvirt.host [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.806 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.806 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.807 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.807 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.807 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.807 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.807 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.808 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.808 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.808 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.808 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.808 182096 DEBUG nova.virt.hardware [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.809 182096 DEBUG nova.objects.instance [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.822 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.869 182096 DEBUG oslo_concurrency.processutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.config --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.870 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.870 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.871 182096 DEBUG oslo_concurrency.lockutils [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.873 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <uuid>7bf6771c-28de-4cd3-a95a-d8a3e8b25928</uuid>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <name>instance-00000005</name>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:name>tempest-MigrationsAdminTest-server-1223466208</nova:name>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:13:13</nova:creationTime>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:user uuid="03b66d4c15354adcaae5e4e4cd711e1b">tempest-MigrationsAdminTest-1469473097-project-member</nova:user>
Jan 23 09:13:13 compute-0 nova_compute[182092]:         <nova:project uuid="0ce2bc741e474401a3f9c2ef00c19693">tempest-MigrationsAdminTest-1469473097</nova:project>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <system>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="serial">7bf6771c-28de-4cd3-a95a-d8a3e8b25928</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="uuid">7bf6771c-28de-4cd3-a95a-d8a3e8b25928</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </system>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <os>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </os>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <features>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </features>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.config"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/console.log" append="off"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <video>
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </video>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:13:13 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:13:13 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:13:13 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:13:13 compute-0 nova_compute[182092]: </domain>
Jan 23 09:13:13 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.907 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.908 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:13 compute-0 nova_compute[182092]: 2026-01-23 09:13:13.908 182096 INFO nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Using config drive
Jan 23 09:13:13 compute-0 systemd-machined[153562]: New machine qemu-3-instance-00000005.
Jan 23 09:13:13 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.373 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159594.3731582, 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.374 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] VM Resumed (Lifecycle Event)
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.377 182096 DEBUG nova.compute.manager [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.380 182096 INFO nova.virt.libvirt.driver [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance running successfully.
Jan 23 09:13:14 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.382 182096 DEBUG nova.virt.libvirt.guest [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.382 182096 DEBUG nova.virt.libvirt.driver [None req-5bbd4f81-28a7-44a1-89c8-6709389624a5 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.413 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.418 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.460 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.461 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159594.3769906, 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.461 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] VM Started (Lifecycle Event)
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.482 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.484 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:14.607 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:13:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:14.608 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:13:14 compute-0 nova_compute[182092]: 2026-01-23 09:13:14.608 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:14.609 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:15 compute-0 nova_compute[182092]: 2026-01-23 09:13:15.376 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159580.3753989, 7fab73f4-ebb4-47b3-a487-7eeecb2e5295 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:15 compute-0 nova_compute[182092]: 2026-01-23 09:13:15.377 182096 INFO nova.compute.manager [-] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] VM Stopped (Lifecycle Event)
Jan 23 09:13:15 compute-0 nova_compute[182092]: 2026-01-23 09:13:15.402 182096 DEBUG nova.compute.manager [None req-63501c9a-62d1-4466-8663-beac166d92d6 - - - - - -] [instance: 7fab73f4-ebb4-47b3-a487-7eeecb2e5295] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:16 compute-0 nova_compute[182092]: 2026-01-23 09:13:16.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:17 compute-0 nova_compute[182092]: 2026-01-23 09:13:17.441 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:20 compute-0 podman[210619]: 2026-01-23 09:13:20.206151531 +0000 UTC m=+0.041831185 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 09:13:21 compute-0 nova_compute[182092]: 2026-01-23 09:13:21.830 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:21 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:13:21 compute-0 systemd[210557]: Activating special unit Exit the Session...
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped target Main User Target.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped target Basic System.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped target Paths.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped target Sockets.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped target Timers.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:21 compute-0 systemd[210557]: Closed D-Bus User Message Bus Socket.
Jan 23 09:13:21 compute-0 systemd[210557]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:13:21 compute-0 systemd[210557]: Removed slice User Application Slice.
Jan 23 09:13:21 compute-0 systemd[210557]: Reached target Shutdown.
Jan 23 09:13:21 compute-0 systemd[210557]: Finished Exit the Session.
Jan 23 09:13:21 compute-0 systemd[210557]: Reached target Exit the Session.
Jan 23 09:13:21 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:13:21 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:13:21 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:13:21 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:13:21 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:13:21 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:13:21 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:13:21 compute-0 podman[210635]: 2026-01-23 09:13:21.975756885 +0000 UTC m=+0.052689539 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:13:22 compute-0 nova_compute[182092]: 2026-01-23 09:13:22.443 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.026 182096 DEBUG nova.compute.manager [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.160 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.160 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.181 182096 DEBUG nova.objects.instance [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.190 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.190 182096 INFO nova.compute.claims [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.190 182096 DEBUG nova.objects.instance [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.201 182096 DEBUG nova.objects.instance [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:25 compute-0 podman[210665]: 2026-01-23 09:13:25.219773634 +0000 UTC m=+0.051849985 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.226 182096 INFO nova.compute.resource_tracker [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updating resource usage from migration 26706623-00f8-4fb9-8367-7f1939825631
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.226 182096 DEBUG nova.compute.resource_tracker [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Starting to track incoming migration 26706623-00f8-4fb9-8367-7f1939825631 with flavor 9e575731-b613-4b19-83e1-51cae9e2c5da _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.281 182096 DEBUG nova.compute.provider_tree [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.290 182096 DEBUG nova.scheduler.client.report [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.303 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:25 compute-0 nova_compute[182092]: 2026-01-23 09:13:25.304 182096 INFO nova.compute.manager [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Migrating
Jan 23 09:13:26 compute-0 sshd-session[210683]: Accepted publickey for nova from 192.168.122.102 port 52744 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:26 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:13:26 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:13:26 compute-0 systemd-logind[746]: New session 33 of user nova.
Jan 23 09:13:26 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:13:26 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:13:26 compute-0 systemd[210687]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:26 compute-0 systemd[210687]: Queued start job for default target Main User Target.
Jan 23 09:13:26 compute-0 systemd[210687]: Created slice User Application Slice.
Jan 23 09:13:26 compute-0 systemd[210687]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:26 compute-0 systemd[210687]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:26 compute-0 systemd[210687]: Reached target Paths.
Jan 23 09:13:26 compute-0 systemd[210687]: Reached target Timers.
Jan 23 09:13:26 compute-0 systemd[210687]: Starting D-Bus User Message Bus Socket...
Jan 23 09:13:26 compute-0 systemd[210687]: Starting Create User's Volatile Files and Directories...
Jan 23 09:13:26 compute-0 systemd[210687]: Finished Create User's Volatile Files and Directories.
Jan 23 09:13:26 compute-0 systemd[210687]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:13:26 compute-0 systemd[210687]: Reached target Sockets.
Jan 23 09:13:26 compute-0 systemd[210687]: Reached target Basic System.
Jan 23 09:13:26 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:13:26 compute-0 systemd[210687]: Reached target Main User Target.
Jan 23 09:13:26 compute-0 systemd[210687]: Startup finished in 96ms.
Jan 23 09:13:26 compute-0 systemd[1]: Started Session 33 of User nova.
Jan 23 09:13:26 compute-0 sshd-session[210683]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:26 compute-0 sshd-session[210703]: Received disconnect from 192.168.122.102 port 52744:11: disconnected by user
Jan 23 09:13:26 compute-0 sshd-session[210703]: Disconnected from user nova 192.168.122.102 port 52744
Jan 23 09:13:26 compute-0 sshd-session[210683]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:26 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 09:13:26 compute-0 systemd-logind[746]: Session 33 logged out. Waiting for processes to exit.
Jan 23 09:13:26 compute-0 systemd-logind[746]: Removed session 33.
Jan 23 09:13:26 compute-0 sshd-session[210705]: Accepted publickey for nova from 192.168.122.102 port 52750 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:26 compute-0 systemd-logind[746]: New session 35 of user nova.
Jan 23 09:13:26 compute-0 systemd[1]: Started Session 35 of User nova.
Jan 23 09:13:26 compute-0 sshd-session[210705]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:26 compute-0 sshd-session[210708]: Received disconnect from 192.168.122.102 port 52750:11: disconnected by user
Jan 23 09:13:26 compute-0 sshd-session[210708]: Disconnected from user nova 192.168.122.102 port 52750
Jan 23 09:13:26 compute-0 sshd-session[210705]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:26 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Jan 23 09:13:26 compute-0 systemd-logind[746]: Session 35 logged out. Waiting for processes to exit.
Jan 23 09:13:26 compute-0 systemd-logind[746]: Removed session 35.
Jan 23 09:13:26 compute-0 nova_compute[182092]: 2026-01-23 09:13:26.831 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:27 compute-0 nova_compute[182092]: 2026-01-23 09:13:27.445 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:31 compute-0 nova_compute[182092]: 2026-01-23 09:13:31.833 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:32 compute-0 podman[210710]: 2026-01-23 09:13:32.228155174 +0000 UTC m=+0.066869696 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:13:32 compute-0 nova_compute[182092]: 2026-01-23 09:13:32.446 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.173 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.173 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.189 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.271 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.271 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.277 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.277 182096 INFO nova.compute.claims [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.407 182096 DEBUG nova.compute.provider_tree [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.418 182096 DEBUG nova.scheduler.client.report [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.435 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.435 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.470 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.471 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.481 182096 INFO nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.493 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.567 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.568 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.568 182096 INFO nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Creating image(s)
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.569 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.569 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.569 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.580 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.639 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.640 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.640 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.650 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.705 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.706 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.729 182096 DEBUG nova.policy [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '50480754599d4ae387d8c846a334d4bb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7866c4af706d4fac8aba28da7683a209', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.736 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.736 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.737 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.793 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.794 182096 DEBUG nova.virt.disk.api [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Checking if we can resize image /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.795 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.851 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.852 182096 DEBUG nova.virt.disk.api [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Cannot resize image /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.852 182096 DEBUG nova.objects.instance [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'migration_context' on Instance uuid 99ca87e0-9a78-444e-bbf7-2c2e20731100 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.865 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.865 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Ensure instance console log exists: /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.866 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.866 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:33 compute-0 nova_compute[182092]: 2026-01-23 09:13:33.866 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:34 compute-0 nova_compute[182092]: 2026-01-23 09:13:34.424 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Successfully created port: a34494c7-736a-4fd1-8d10-8cfcaebdfa0c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.063 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Successfully updated port: a34494c7-736a-4fd1-8d10-8cfcaebdfa0c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.075 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.076 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquired lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.076 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.163 182096 DEBUG nova.compute.manager [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received event network-changed-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.163 182096 DEBUG nova.compute.manager [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Refreshing instance network info cache due to event network-changed-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.164 182096 DEBUG oslo_concurrency.lockutils [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:35 compute-0 nova_compute[182092]: 2026-01-23 09:13:35.621 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.285 182096 DEBUG nova.network.neutron [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updating instance_info_cache with network_info: [{"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.299 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Releasing lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.299 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Instance network_info: |[{"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.299 182096 DEBUG oslo_concurrency.lockutils [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.300 182096 DEBUG nova.network.neutron [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Refreshing network info cache for port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.302 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Start _get_guest_xml network_info=[{"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.306 182096 WARNING nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.311 182096 DEBUG nova.virt.libvirt.host [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.312 182096 DEBUG nova.virt.libvirt.host [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.314 182096 DEBUG nova.virt.libvirt.host [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.314 182096 DEBUG nova.virt.libvirt.host [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.315 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.315 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:13:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='420267028',id=26,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1325629211',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.316 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.316 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.316 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.316 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.317 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.317 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.317 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.317 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.317 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.318 182096 DEBUG nova.virt.hardware [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.320 182096 DEBUG nova.virt.libvirt.vif [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:13:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-557311420',id=10,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-w20q3kv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:13:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=99ca87e0-9a78-444e-bbf7-2c2e20731100,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.320 182096 DEBUG nova.network.os_vif_util [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.321 182096 DEBUG nova.network.os_vif_util [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.322 182096 DEBUG nova.objects.instance [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99ca87e0-9a78-444e-bbf7-2c2e20731100 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.330 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <uuid>99ca87e0-9a78-444e-bbf7-2c2e20731100</uuid>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <name>instance-0000000a</name>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-557311420</nova:name>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:13:36</nova:creationTime>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:flavor name="tempest-flavor_with_ephemeral_0-1325629211">
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:user uuid="50480754599d4ae387d8c846a334d4bb">tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member</nova:user>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:project uuid="7866c4af706d4fac8aba28da7683a209">tempest-ServersWithSpecificFlavorTestJSON-1599791507</nova:project>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         <nova:port uuid="a34494c7-736a-4fd1-8d10-8cfcaebdfa0c">
Jan 23 09:13:36 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <system>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="serial">99ca87e0-9a78-444e-bbf7-2c2e20731100</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="uuid">99ca87e0-9a78-444e-bbf7-2c2e20731100</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </system>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <os>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </os>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <features>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </features>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.config"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:18:62:f9"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <target dev="tapa34494c7-73"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/console.log" append="off"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <video>
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </video>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:13:36 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:13:36 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:13:36 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:13:36 compute-0 nova_compute[182092]: </domain>
Jan 23 09:13:36 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.331 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Preparing to wait for external event network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.332 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.332 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.332 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.333 182096 DEBUG nova.virt.libvirt.vif [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:13:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-557311420',id=10,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-w20q3kv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:13:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=99ca87e0-9a78-444e-bbf7-2c2e20731100,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.333 182096 DEBUG nova.network.os_vif_util [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.334 182096 DEBUG nova.network.os_vif_util [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.334 182096 DEBUG os_vif [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.335 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.335 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.336 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.338 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.338 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa34494c7-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.339 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa34494c7-73, col_values=(('external_ids', {'iface-id': 'a34494c7-736a-4fd1-8d10-8cfcaebdfa0c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:62:f9', 'vm-uuid': '99ca87e0-9a78-444e-bbf7-2c2e20731100'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.340 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 NetworkManager[54920]: <info>  [1769159616.3419] manager: (tapa34494c7-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.344 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.348 182096 INFO os_vif [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73')
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.385 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.385 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.385 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No VIF found with MAC fa:16:3e:18:62:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.386 182096 INFO nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Using config drive
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.686 182096 INFO nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Creating config drive at /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.config
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.692 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp052amf8i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.820 182096 DEBUG oslo_concurrency.processutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp052amf8i" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.834 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 kernel: tapa34494c7-73: entered promiscuous mode
Jan 23 09:13:36 compute-0 NetworkManager[54920]: <info>  [1769159616.8741] manager: (tapa34494c7-73): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 23 09:13:36 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.878 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 systemd[210687]: Activating special unit Exit the Session...
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped target Main User Target.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped target Basic System.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped target Paths.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped target Sockets.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped target Timers.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:36 compute-0 systemd[210687]: Closed D-Bus User Message Bus Socket.
Jan 23 09:13:36 compute-0 systemd[210687]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.880 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 systemd[210687]: Removed slice User Application Slice.
Jan 23 09:13:36 compute-0 systemd[210687]: Reached target Shutdown.
Jan 23 09:13:36 compute-0 systemd[210687]: Finished Exit the Session.
Jan 23 09:13:36 compute-0 systemd[210687]: Reached target Exit the Session.
Jan 23 09:13:36 compute-0 ovn_controller[94697]: 2026-01-23T09:13:36Z|00035|binding|INFO|Claiming lport a34494c7-736a-4fd1-8d10-8cfcaebdfa0c for this chassis.
Jan 23 09:13:36 compute-0 ovn_controller[94697]: 2026-01-23T09:13:36Z|00036|binding|INFO|a34494c7-736a-4fd1-8d10-8cfcaebdfa0c: Claiming fa:16:3e:18:62:f9 10.100.0.7
Jan 23 09:13:36 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:13:36 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.890 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:62:f9 10.100.0.7'], port_security=['fa:16:3e:18:62:f9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '99ca87e0-9a78-444e-bbf7-2c2e20731100', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.891 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c in datapath 58d6a822-c968-4a80-a119-b33b7666b94b bound to our chassis
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.892 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.907 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[202311aa-f808-4b03-9b97-2612493fac54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.907 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58d6a822-c1 in ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.909 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58d6a822-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.909 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dec08756-6954-4c94-b765-96f850e1a020]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.910 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5b49c9-5ccc-4cba-b3ba-ea51f7975702]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.921 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f48431-e83d-4a89-bb2e-c97d39776626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.938 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:13:36 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:13:36 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:13:36 compute-0 ovn_controller[94697]: 2026-01-23T09:13:36Z|00037|binding|INFO|Setting lport a34494c7-736a-4fd1-8d10-8cfcaebdfa0c ovn-installed in OVS
Jan 23 09:13:36 compute-0 ovn_controller[94697]: 2026-01-23T09:13:36Z|00038|binding|INFO|Setting lport a34494c7-736a-4fd1-8d10-8cfcaebdfa0c up in Southbound
Jan 23 09:13:36 compute-0 nova_compute[182092]: 2026-01-23 09:13:36.944 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:36 compute-0 systemd-udevd[210771]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.958 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7fccea-9900-4b7f-8a01-177d9050d7e0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 systemd-machined[153562]: New machine qemu-4-instance-0000000a.
Jan 23 09:13:36 compute-0 NetworkManager[54920]: <info>  [1769159616.9695] device (tapa34494c7-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:13:36 compute-0 NetworkManager[54920]: <info>  [1769159616.9704] device (tapa34494c7-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:13:36 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.988 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[aa464c89-7c39-4d7d-8f00-a23edc7ddcf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:36 compute-0 NetworkManager[54920]: <info>  [1769159616.9934] manager: (tap58d6a822-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 09:13:36 compute-0 systemd-udevd[210776]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:13:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:36.993 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4bed45f5-6a44-49a2-9c59-403996ff98bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.025 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[499d06a8-ba26-45e3-a273-f2530f45392c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.028 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5d656ae0-8abf-4333-9e5f-d0e5c20e8c56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 NetworkManager[54920]: <info>  [1769159617.0498] device (tap58d6a822-c0): carrier: link connected
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.054 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7d985923-ca25-4fb6-8f3d-4b2cbff93710]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.072 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4989f4f-84b5-47ed-b8fa-2f4c06bb510a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d6a822-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:81:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312253, 'reachable_time': 36504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210794, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.088 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3fe177-7ccf-4bd2-a4d5-1d8f3a573519]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:8133'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 312253, 'tstamp': 312253}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210796, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.102 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c17fb5fc-01eb-4016-b969-0e9e941a76c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d6a822-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:81:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312253, 'reachable_time': 36504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210797, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.130 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[56748314-e7ed-4325-9b7c-e3e20fc1fc6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.176 182096 DEBUG nova.compute.manager [req-c880d253-ded2-409c-80c6-ac6973096345 req-2274c0b8-4021-4099-8628-d2e39a0bae99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received event network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.177 182096 DEBUG oslo_concurrency.lockutils [req-c880d253-ded2-409c-80c6-ac6973096345 req-2274c0b8-4021-4099-8628-d2e39a0bae99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.177 182096 DEBUG oslo_concurrency.lockutils [req-c880d253-ded2-409c-80c6-ac6973096345 req-2274c0b8-4021-4099-8628-d2e39a0bae99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.177 182096 DEBUG oslo_concurrency.lockutils [req-c880d253-ded2-409c-80c6-ac6973096345 req-2274c0b8-4021-4099-8628-d2e39a0bae99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.178 182096 DEBUG nova.compute.manager [req-c880d253-ded2-409c-80c6-ac6973096345 req-2274c0b8-4021-4099-8628-d2e39a0bae99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Processing event network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.182 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b20fdff6-f103-4671-8b7e-606ca66282a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.184 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d6a822-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.184 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.184 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d6a822-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:37 compute-0 kernel: tap58d6a822-c0: entered promiscuous mode
Jan 23 09:13:37 compute-0 NetworkManager[54920]: <info>  [1769159617.1864] manager: (tap58d6a822-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.187 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.188 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d6a822-c0, col_values=(('external_ids', {'iface-id': '4b01708c-9f7f-4701-bf11-ec5e700921db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:37 compute-0 ovn_controller[94697]: 2026-01-23T09:13:37Z|00039|binding|INFO|Releasing lport 4b01708c-9f7f-4701-bf11-ec5e700921db from this chassis (sb_readonly=0)
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.191 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.195 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee2c64d-47d8-4515-b277-e94e85aaf7c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.196 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:13:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:37.197 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'env', 'PROCESS_TAG=haproxy-58d6a822-c968-4a80-a119-b33b7666b94b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58d6a822-c968-4a80-a119-b33b7666b94b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:37 compute-0 podman[210825]: 2026-01-23 09:13:37.53598249 +0000 UTC m=+0.036482861 container create 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 09:13:37 compute-0 systemd[1]: Started libpod-conmon-86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398.scope.
Jan 23 09:13:37 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:13:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e903add69059fd238b44228c31d982fe6bd1338b8fd86354997bc22613c6d316/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:13:37 compute-0 podman[210825]: 2026-01-23 09:13:37.598116506 +0000 UTC m=+0.098616897 container init 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:13:37 compute-0 podman[210825]: 2026-01-23 09:13:37.603268581 +0000 UTC m=+0.103768951 container start 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:13:37 compute-0 podman[210825]: 2026-01-23 09:13:37.518774805 +0000 UTC m=+0.019275195 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:13:37 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [NOTICE]   (210847) : New worker (210850) forked
Jan 23 09:13:37 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [NOTICE]   (210847) : Loading success.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.663 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159617.662815, 99ca87e0-9a78-444e-bbf7-2c2e20731100 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.663 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] VM Started (Lifecycle Event)
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.666 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.675 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.677 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.680 182096 INFO nova.virt.libvirt.driver [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Instance spawned successfully.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.680 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.682 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.695 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.695 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159617.6636927, 99ca87e0-9a78-444e-bbf7-2c2e20731100 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.695 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] VM Paused (Lifecycle Event)
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.698 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.699 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.699 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.699 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.700 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.700 182096 DEBUG nova.virt.libvirt.driver [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.706 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.708 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159617.667936, 99ca87e0-9a78-444e-bbf7-2c2e20731100 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.708 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] VM Resumed (Lifecycle Event)
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.726 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.727 182096 DEBUG nova.network.neutron [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updated VIF entry in instance network info cache for port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.727 182096 DEBUG nova.network.neutron [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updating instance_info_cache with network_info: [{"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.729 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.742 182096 DEBUG oslo_concurrency.lockutils [req-ee8e3c7f-c66c-4fd5-a449-16b39c41bacf req-d15b8537-6b83-4c16-9289-1f2be2e65fb7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.744 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.755 182096 INFO nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Took 4.19 seconds to spawn the instance on the hypervisor.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.756 182096 DEBUG nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.803 182096 INFO nova.compute.manager [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Took 4.56 seconds to build instance.
Jan 23 09:13:37 compute-0 nova_compute[182092]: 2026-01-23 09:13:37.815 182096 DEBUG oslo_concurrency.lockutils [None req-f70e77d2-6624-4d34-ba0d-b0ac6b0fc5e2 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:38 compute-0 podman[210856]: 2026-01-23 09:13:38.21927356 +0000 UTC m=+0.049948341 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:13:38 compute-0 podman[210855]: 2026-01-23 09:13:38.225284995 +0000 UTC m=+0.057563159 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.313 182096 DEBUG nova.compute.manager [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received event network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.314 182096 DEBUG oslo_concurrency.lockutils [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.315 182096 DEBUG oslo_concurrency.lockutils [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.315 182096 DEBUG oslo_concurrency.lockutils [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.315 182096 DEBUG nova.compute.manager [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] No waiting events found dispatching network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:13:39 compute-0 nova_compute[182092]: 2026-01-23 09:13:39.316 182096 WARNING nova.compute.manager [req-b559b772-0a96-41e8-9d4f-2d49765d5798 req-f4871b7a-ca5c-4ff0-889f-f58e18015e52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received unexpected event network-vif-plugged-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c for instance with vm_state active and task_state None.
Jan 23 09:13:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:39.850 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:39.851 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:39.852 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:39 compute-0 sshd-session[210893]: Accepted publickey for nova from 192.168.122.102 port 43262 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:39 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:13:39 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:13:39 compute-0 systemd-logind[746]: New session 36 of user nova.
Jan 23 09:13:39 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:13:40 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:13:40 compute-0 systemd[210897]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0437] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0442] device (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <warn>  [1769159620.0443] device (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0447] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0448] device (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <warn>  [1769159620.0449] device (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:13:40 compute-0 nova_compute[182092]: 2026-01-23 09:13:40.043 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0452] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0455] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0457] device (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 09:13:40 compute-0 NetworkManager[54920]: <info>  [1769159620.0458] device (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 09:13:40 compute-0 nova_compute[182092]: 2026-01-23 09:13:40.118 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:40 compute-0 ovn_controller[94697]: 2026-01-23T09:13:40Z|00040|binding|INFO|Releasing lport 4b01708c-9f7f-4701-bf11-ec5e700921db from this chassis (sb_readonly=0)
Jan 23 09:13:40 compute-0 nova_compute[182092]: 2026-01-23 09:13:40.130 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:40 compute-0 systemd[210897]: Queued start job for default target Main User Target.
Jan 23 09:13:40 compute-0 systemd[210897]: Created slice User Application Slice.
Jan 23 09:13:40 compute-0 systemd[210897]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:40 compute-0 systemd[210897]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:40 compute-0 systemd[210897]: Reached target Paths.
Jan 23 09:13:40 compute-0 systemd[210897]: Reached target Timers.
Jan 23 09:13:40 compute-0 systemd[210897]: Starting D-Bus User Message Bus Socket...
Jan 23 09:13:40 compute-0 systemd[210897]: Starting Create User's Volatile Files and Directories...
Jan 23 09:13:40 compute-0 systemd[210897]: Finished Create User's Volatile Files and Directories.
Jan 23 09:13:40 compute-0 systemd[210897]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:13:40 compute-0 systemd[210897]: Reached target Sockets.
Jan 23 09:13:40 compute-0 systemd[210897]: Reached target Basic System.
Jan 23 09:13:40 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:13:40 compute-0 systemd[210897]: Reached target Main User Target.
Jan 23 09:13:40 compute-0 systemd[210897]: Startup finished in 147ms.
Jan 23 09:13:40 compute-0 systemd[1]: Started Session 36 of User nova.
Jan 23 09:13:40 compute-0 sshd-session[210893]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:40 compute-0 sshd-session[210914]: Received disconnect from 192.168.122.102 port 43262:11: disconnected by user
Jan 23 09:13:40 compute-0 sshd-session[210914]: Disconnected from user nova 192.168.122.102 port 43262
Jan 23 09:13:40 compute-0 sshd-session[210893]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:40 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 09:13:40 compute-0 systemd-logind[746]: Session 36 logged out. Waiting for processes to exit.
Jan 23 09:13:40 compute-0 systemd-logind[746]: Removed session 36.
Jan 23 09:13:40 compute-0 sshd-session[210916]: Accepted publickey for nova from 192.168.122.102 port 43268 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:40 compute-0 systemd-logind[746]: New session 38 of user nova.
Jan 23 09:13:40 compute-0 systemd[1]: Started Session 38 of User nova.
Jan 23 09:13:40 compute-0 sshd-session[210916]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:40 compute-0 sshd-session[210919]: Received disconnect from 192.168.122.102 port 43268:11: disconnected by user
Jan 23 09:13:40 compute-0 sshd-session[210919]: Disconnected from user nova 192.168.122.102 port 43268
Jan 23 09:13:40 compute-0 sshd-session[210916]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:40 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Jan 23 09:13:40 compute-0 systemd-logind[746]: Session 38 logged out. Waiting for processes to exit.
Jan 23 09:13:40 compute-0 systemd-logind[746]: Removed session 38.
Jan 23 09:13:40 compute-0 sshd-session[210921]: Accepted publickey for nova from 192.168.122.102 port 43276 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:40 compute-0 systemd-logind[746]: New session 39 of user nova.
Jan 23 09:13:40 compute-0 systemd[1]: Started Session 39 of User nova.
Jan 23 09:13:40 compute-0 sshd-session[210921]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:13:40 compute-0 sshd-session[210924]: Received disconnect from 192.168.122.102 port 43276:11: disconnected by user
Jan 23 09:13:40 compute-0 sshd-session[210924]: Disconnected from user nova 192.168.122.102 port 43276
Jan 23 09:13:40 compute-0 sshd-session[210921]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:13:40 compute-0 systemd-logind[746]: Session 39 logged out. Waiting for processes to exit.
Jan 23 09:13:40 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 09:13:40 compute-0 systemd-logind[746]: Removed session 39.
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.341 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.566 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.567 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.567 182096 DEBUG nova.network.neutron [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.641 182096 DEBUG nova.compute.manager [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received event network-changed-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.643 182096 DEBUG nova.compute.manager [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Refreshing instance network info cache due to event network-changed-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.643 182096 DEBUG oslo_concurrency.lockutils [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.643 182096 DEBUG oslo_concurrency.lockutils [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.643 182096 DEBUG nova.network.neutron [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Refreshing network info cache for port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.700 182096 DEBUG nova.network.neutron [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:41 compute-0 nova_compute[182092]: 2026-01-23 09:13:41.836 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.843 182096 DEBUG nova.network.neutron [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.872 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.967 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.969 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.969 182096 INFO nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Creating image(s)
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.970 182096 DEBUG nova.objects.instance [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:42 compute-0 nova_compute[182092]: 2026-01-23 09:13:42.979 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.041 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.043 182096 DEBUG nova.virt.disk.api [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Checking if we can resize image /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.043 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.103 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.105 182096 DEBUG nova.virt.disk.api [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Cannot resize image /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.129 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.129 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Ensure instance console log exists: /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.130 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.130 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.131 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.132 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.137 182096 WARNING nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.141 182096 DEBUG nova.virt.libvirt.host [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.141 182096 DEBUG nova.virt.libvirt.host [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.143 182096 DEBUG nova.virt.libvirt.host [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.144 182096 DEBUG nova.virt.libvirt.host [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.145 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.145 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9e575731-b613-4b19-83e1-51cae9e2c5da',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.146 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.146 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.146 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.147 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.147 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.147 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.147 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.148 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.148 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.148 182096 DEBUG nova.virt.hardware [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.148 182096 DEBUG nova.objects.instance [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.163 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.223 182096 DEBUG oslo_concurrency.processutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.config --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.225 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.225 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.226 182096 DEBUG oslo_concurrency.lockutils [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.229 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <uuid>0e8dc87a-78f6-4d2d-b19b-6c51917bfd58</uuid>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <name>instance-00000009</name>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <memory>196608</memory>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:name>tempest-MigrationsAdminTest-server-2016787926</nova:name>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:13:43</nova:creationTime>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:flavor name="m1.micro">
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:memory>192</nova:memory>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:user uuid="03b66d4c15354adcaae5e4e4cd711e1b">tempest-MigrationsAdminTest-1469473097-project-member</nova:user>
Jan 23 09:13:43 compute-0 nova_compute[182092]:         <nova:project uuid="0ce2bc741e474401a3f9c2ef00c19693">tempest-MigrationsAdminTest-1469473097</nova:project>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <system>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="serial">0e8dc87a-78f6-4d2d-b19b-6c51917bfd58</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="uuid">0e8dc87a-78f6-4d2d-b19b-6c51917bfd58</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </system>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <os>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </os>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <features>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </features>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.config"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/console.log" append="off"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <video>
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </video>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:13:43 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:13:43 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:13:43 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:13:43 compute-0 nova_compute[182092]: </domain>
Jan 23 09:13:43 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.378 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.381 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.382 182096 INFO nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Using config drive
Jan 23 09:13:43 compute-0 systemd-machined[153562]: New machine qemu-5-instance-00000009.
Jan 23 09:13:43 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.636 182096 DEBUG nova.network.neutron [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updated VIF entry in instance network info cache for port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.637 182096 DEBUG nova.network.neutron [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updating instance_info_cache with network_info: [{"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:43 compute-0 nova_compute[182092]: 2026-01-23 09:13:43.650 182096 DEBUG oslo_concurrency.lockutils [req-e3f8b45f-4a12-46e2-b644-a3e309c3463f req-50e66422-e3fc-4558-aabf-14ffc9f730a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-99ca87e0-9a78-444e-bbf7-2c2e20731100" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.095 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159626.095475, 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.096 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] VM Resumed (Lifecycle Event)
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.098 182096 DEBUG nova.compute.manager [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.101 182096 INFO nova.virt.libvirt.driver [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance running successfully.
Jan 23 09:13:46 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.102 182096 DEBUG nova.virt.libvirt.guest [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.103 182096 DEBUG nova.virt.libvirt.driver [None req-9e71136e-5765-47d9-bd9e-6819121cd16c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.125 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.130 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.167 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.167 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159626.0955932, 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.167 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] VM Started (Lifecycle Event)
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.185 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.187 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.348 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:46 compute-0 nova_compute[182092]: 2026-01-23 09:13:46.838 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:48 compute-0 ovn_controller[94697]: 2026-01-23T09:13:48Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:18:62:f9 10.100.0.7
Jan 23 09:13:48 compute-0 ovn_controller[94697]: 2026-01-23T09:13:48Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:18:62:f9 10.100.0.7
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.672 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.672 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.743 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.815 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.815 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.879 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.884 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.946 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:49 compute-0 nova_compute[182092]: 2026-01-23 09:13:49.946 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.009 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.016 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.081 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.082 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.144 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.381 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.383 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5210MB free_disk=73.29507446289062GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": 
"0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.383 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.383 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.457 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.457 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.457 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 99ca87e0-9a78-444e-bbf7-2c2e20731100 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.457 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.458 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=4 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.520 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.532 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.547 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:13:50 compute-0 nova_compute[182092]: 2026-01-23 09:13:50.547 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:51 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:13:51 compute-0 systemd[210897]: Activating special unit Exit the Session...
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped target Main User Target.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped target Basic System.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped target Paths.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped target Sockets.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped target Timers.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:13:51 compute-0 systemd[210897]: Closed D-Bus User Message Bus Socket.
Jan 23 09:13:51 compute-0 systemd[210897]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:13:51 compute-0 systemd[210897]: Removed slice User Application Slice.
Jan 23 09:13:51 compute-0 systemd[210897]: Reached target Shutdown.
Jan 23 09:13:51 compute-0 systemd[210897]: Finished Exit the Session.
Jan 23 09:13:51 compute-0 systemd[210897]: Reached target Exit the Session.
Jan 23 09:13:51 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:13:51 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:13:51 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:13:51 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:13:51 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:13:51 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:13:51 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:13:51 compute-0 podman[210996]: 2026-01-23 09:13:51.215300109 +0000 UTC m=+0.048528935 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.352 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.543 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.544 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.544 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.669 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.669 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.669 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.840 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.842 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.843 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.843 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:13:51 compute-0 nova_compute[182092]: 2026-01-23 09:13:51.843 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.096 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:13:52 compute-0 podman[211013]: 2026-01-23 09:13:52.20524743 +0000 UTC m=+0.043088816 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.338 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.351 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.352 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.352 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.352 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:13:52 compute-0 nova_compute[182092]: 2026-01-23 09:13:52.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:53 compute-0 nova_compute[182092]: 2026-01-23 09:13:53.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:13:56 compute-0 podman[211046]: 2026-01-23 09:13:56.209369274 +0000 UTC m=+0.046248914 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Jan 23 09:13:56 compute-0 nova_compute[182092]: 2026-01-23 09:13:56.355 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:56 compute-0 nova_compute[182092]: 2026-01-23 09:13:56.842 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.026 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.268 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.268 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.268 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.268 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.269 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.275 182096 INFO nova.compute.manager [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Terminating instance
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.280 182096 DEBUG nova.compute.manager [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:13:57 compute-0 kernel: tapa34494c7-73 (unregistering): left promiscuous mode
Jan 23 09:13:57 compute-0 NetworkManager[54920]: <info>  [1769159637.3038] device (tapa34494c7-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:13:57 compute-0 ovn_controller[94697]: 2026-01-23T09:13:57Z|00041|binding|INFO|Releasing lport a34494c7-736a-4fd1-8d10-8cfcaebdfa0c from this chassis (sb_readonly=0)
Jan 23 09:13:57 compute-0 ovn_controller[94697]: 2026-01-23T09:13:57Z|00042|binding|INFO|Setting lport a34494c7-736a-4fd1-8d10-8cfcaebdfa0c down in Southbound
Jan 23 09:13:57 compute-0 ovn_controller[94697]: 2026-01-23T09:13:57Z|00043|binding|INFO|Removing iface tapa34494c7-73 ovn-installed in OVS
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.312 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:62:f9 10.100.0.7'], port_security=['fa:16:3e:18:62:f9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '99ca87e0-9a78-444e-bbf7-2c2e20731100', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.313 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a34494c7-736a-4fd1-8d10-8cfcaebdfa0c in datapath 58d6a822-c968-4a80-a119-b33b7666b94b unbound from our chassis
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.309 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.314 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d6a822-c968-4a80-a119-b33b7666b94b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.316 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0bedf4-f560-4fb3-aa6f-c39632eaa687]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.316 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b namespace which is not needed anymore
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.324 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 23 09:13:57 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 11.708s CPU time.
Jan 23 09:13:57 compute-0 systemd-machined[153562]: Machine qemu-4-instance-0000000a terminated.
Jan 23 09:13:57 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [NOTICE]   (210847) : haproxy version is 2.8.14-c23fe91
Jan 23 09:13:57 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [NOTICE]   (210847) : path to executable is /usr/sbin/haproxy
Jan 23 09:13:57 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [WARNING]  (210847) : Exiting Master process...
Jan 23 09:13:57 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [ALERT]    (210847) : Current worker (210850) exited with code 143 (Terminated)
Jan 23 09:13:57 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[210840]: [WARNING]  (210847) : All workers exited. Exiting... (0)
Jan 23 09:13:57 compute-0 systemd[1]: libpod-86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398.scope: Deactivated successfully.
Jan 23 09:13:57 compute-0 podman[211085]: 2026-01-23 09:13:57.417033237 +0000 UTC m=+0.035757653 container died 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:13:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398-userdata-shm.mount: Deactivated successfully.
Jan 23 09:13:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-e903add69059fd238b44228c31d982fe6bd1338b8fd86354997bc22613c6d316-merged.mount: Deactivated successfully.
Jan 23 09:13:57 compute-0 podman[211085]: 2026-01-23 09:13:57.4362282 +0000 UTC m=+0.054952626 container cleanup 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 09:13:57 compute-0 systemd[1]: libpod-conmon-86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398.scope: Deactivated successfully.
Jan 23 09:13:57 compute-0 podman[211108]: 2026-01-23 09:13:57.475527631 +0000 UTC m=+0.023956633 container remove 86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.478 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[78ccc97f-384e-4efc-8821-9200c19796b6]: (4, ('Fri Jan 23 09:13:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b (86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398)\n86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398\nFri Jan 23 09:13:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b (86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398)\n86168152f1e1aef63ba661c1b5b968ee1fe46381cda0856bbe160fd2c4096398\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.479 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7912ace9-9f3a-40f5-9fdd-8f484cada61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.480 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d6a822-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.481 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 kernel: tap58d6a822-c0: left promiscuous mode
Jan 23 09:13:57 compute-0 NetworkManager[54920]: <info>  [1769159637.4978] manager: (tapa34494c7-73): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.499 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.501 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b99b71fa-ffdf-4675-9763-c535466b4df6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.509 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b15b9d4-8198-491f-9a65-4ef900f3a726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.510 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1158fd-90dd-452e-a8bd-7f1834183e99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.524 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ec4ae8-23eb-4640-a066-81d68bd1650f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312247, 'reachable_time': 25788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211131, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d58d6a822\x2dc968\x2d4a80\x2da119\x2db33b7666b94b.mount: Deactivated successfully.
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.528 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:13:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:13:57.528 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6005f22a-1ed6-49df-99c8-86868252c5fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.529 182096 INFO nova.virt.libvirt.driver [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Instance destroyed successfully.
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.530 182096 DEBUG nova.objects.instance [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'resources' on Instance uuid 99ca87e0-9a78-444e-bbf7-2c2e20731100 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.542 182096 DEBUG nova.virt.libvirt.vif [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:13:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-557311420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-557311420',id=10,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:13:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-w20q3kv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:13:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=99ca87e0-9a78-444e-bbf7-2c2e20731100,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.542 182096 DEBUG nova.network.os_vif_util [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "address": "fa:16:3e:18:62:f9", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa34494c7-73", "ovs_interfaceid": "a34494c7-736a-4fd1-8d10-8cfcaebdfa0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.543 182096 DEBUG nova.network.os_vif_util [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.543 182096 DEBUG os_vif [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.545 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa34494c7-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.546 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.550 182096 INFO os_vif [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:62:f9,bridge_name='br-int',has_traffic_filtering=True,id=a34494c7-736a-4fd1-8d10-8cfcaebdfa0c,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa34494c7-73')
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.550 182096 INFO nova.virt.libvirt.driver [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Deleting instance files /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100_del
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.551 182096 INFO nova.virt.libvirt.driver [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Deletion of /var/lib/nova/instances/99ca87e0-9a78-444e-bbf7-2c2e20731100_del complete
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.613 182096 INFO nova.compute.manager [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.614 182096 DEBUG oslo.service.loopingcall [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.615 182096 DEBUG nova.compute.manager [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:13:57 compute-0 nova_compute[182092]: 2026-01-23 09:13:57.615 182096 DEBUG nova.network.neutron [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.091 182096 DEBUG nova.compute.manager [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.178 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.179 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.195 182096 DEBUG nova.objects.instance [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'pci_requests' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.206 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.207 182096 INFO nova.compute.claims [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.207 182096 DEBUG nova.objects.instance [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.214 182096 DEBUG nova.objects.instance [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.244 182096 INFO nova.compute.resource_tracker [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Updating resource usage from migration ea11aebc-54ba-430a-9885-4b4f15cc3dd2
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.245 182096 DEBUG nova.compute.resource_tracker [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Starting to track incoming migration ea11aebc-54ba-430a-9885-4b4f15cc3dd2 with flavor 9e575731-b613-4b19-83e1-51cae9e2c5da _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.346 182096 DEBUG nova.compute.provider_tree [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.370 182096 DEBUG nova.scheduler.client.report [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.383 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.384 182096 INFO nova.compute.manager [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Migrating
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.677 182096 DEBUG nova.network.neutron [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.697 182096 INFO nova.compute.manager [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Took 2.08 seconds to deallocate network for instance.
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.751 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.752 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.840 182096 DEBUG nova.compute.provider_tree [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.851 182096 DEBUG nova.scheduler.client.report [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.882 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.898 182096 DEBUG nova.compute.manager [req-ee81f2a7-be19-4c82-a5d6-b913fc4681c4 req-67175156-9acf-4863-8bcf-072699da912a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Received event network-vif-deleted-a34494c7-736a-4fd1-8d10-8cfcaebdfa0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.906 182096 INFO nova.scheduler.client.report [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Deleted allocations for instance 99ca87e0-9a78-444e-bbf7-2c2e20731100
Jan 23 09:13:59 compute-0 nova_compute[182092]: 2026-01-23 09:13:59.972 182096 DEBUG oslo_concurrency.lockutils [None req-160eb450-880d-4e1a-9f90-3e663e5b9a59 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "99ca87e0-9a78-444e-bbf7-2c2e20731100" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:13:59 compute-0 sshd-session[211138]: Accepted publickey for nova from 192.168.122.102 port 34820 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:13:59 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:14:00 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:14:00 compute-0 systemd-logind[746]: New session 40 of user nova.
Jan 23 09:14:00 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:14:00 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:14:00 compute-0 systemd[211142]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:00 compute-0 systemd[211142]: Queued start job for default target Main User Target.
Jan 23 09:14:00 compute-0 systemd[211142]: Created slice User Application Slice.
Jan 23 09:14:00 compute-0 systemd[211142]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:14:00 compute-0 systemd[211142]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:14:00 compute-0 systemd[211142]: Reached target Paths.
Jan 23 09:14:00 compute-0 systemd[211142]: Reached target Timers.
Jan 23 09:14:00 compute-0 systemd[211142]: Starting D-Bus User Message Bus Socket...
Jan 23 09:14:00 compute-0 systemd[211142]: Starting Create User's Volatile Files and Directories...
Jan 23 09:14:00 compute-0 systemd[211142]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:14:00 compute-0 systemd[211142]: Reached target Sockets.
Jan 23 09:14:00 compute-0 systemd[211142]: Finished Create User's Volatile Files and Directories.
Jan 23 09:14:00 compute-0 systemd[211142]: Reached target Basic System.
Jan 23 09:14:00 compute-0 systemd[211142]: Reached target Main User Target.
Jan 23 09:14:00 compute-0 systemd[211142]: Startup finished in 94ms.
Jan 23 09:14:00 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:14:00 compute-0 systemd[1]: Started Session 40 of User nova.
Jan 23 09:14:00 compute-0 sshd-session[211138]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:00 compute-0 sshd-session[211159]: Received disconnect from 192.168.122.102 port 34820:11: disconnected by user
Jan 23 09:14:00 compute-0 sshd-session[211159]: Disconnected from user nova 192.168.122.102 port 34820
Jan 23 09:14:00 compute-0 sshd-session[211138]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:14:00 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 09:14:00 compute-0 systemd-logind[746]: Session 40 logged out. Waiting for processes to exit.
Jan 23 09:14:00 compute-0 systemd-logind[746]: Removed session 40.
Jan 23 09:14:00 compute-0 sshd-session[211161]: Accepted publickey for nova from 192.168.122.102 port 34830 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:14:00 compute-0 systemd-logind[746]: New session 42 of user nova.
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.303 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:00 compute-0 systemd[1]: Started Session 42 of User nova.
Jan 23 09:14:00 compute-0 sshd-session[211161]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:00 compute-0 sshd-session[211164]: Received disconnect from 192.168.122.102 port 34830:11: disconnected by user
Jan 23 09:14:00 compute-0 sshd-session[211164]: Disconnected from user nova 192.168.122.102 port 34830
Jan 23 09:14:00 compute-0 sshd-session[211161]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:14:00 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 09:14:00 compute-0 systemd-logind[746]: Session 42 logged out. Waiting for processes to exit.
Jan 23 09:14:00 compute-0 systemd-logind[746]: Removed session 42.
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.714 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.715 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.728 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.787 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.787 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.792 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.792 182096 INFO nova.compute.claims [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.943 182096 DEBUG nova.compute.provider_tree [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.952 182096 DEBUG nova.scheduler.client.report [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.982 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:00 compute-0 nova_compute[182092]: 2026-01-23 09:14:00.982 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.043 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.043 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.067 182096 INFO nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.093 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.192 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.193 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.193 182096 INFO nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Creating image(s)
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.193 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.194 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.194 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.205 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.250 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.251 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.252 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.261 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.304 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.305 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.322 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.323 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.323 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.365 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.365 182096 DEBUG nova.virt.disk.api [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Checking if we can resize image /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.366 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.407 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.408 182096 DEBUG nova.virt.disk.api [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Cannot resize image /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.408 182096 DEBUG nova.objects.instance [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'migration_context' on Instance uuid 52fc65a3-ff92-4710-bcde-048b467fdd86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.428 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.428 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.428 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.429 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.429 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.429 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.444 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.444 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.464 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.465 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.476 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.518 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.519 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.520 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.529 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.574 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.575 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.595 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.eph0 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.595 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.596 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.628 182096 DEBUG nova.policy [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '50480754599d4ae387d8c846a334d4bb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7866c4af706d4fac8aba28da7683a209', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.640 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.641 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.641 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Ensure instance console log exists: /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.641 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.641 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.642 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:01 compute-0 nova_compute[182092]: 2026-01-23 09:14:01.844 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:02 compute-0 nova_compute[182092]: 2026-01-23 09:14:02.546 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:02 compute-0 nova_compute[182092]: 2026-01-23 09:14:02.679 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Successfully created port: a4f3c92a-baee-40e3-8bfd-a92067a453df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:14:03 compute-0 podman[211197]: 2026-01-23 09:14:03.224205508 +0000 UTC m=+0.058711454 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.608 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Successfully updated port: a4f3c92a-baee-40e3-8bfd-a92067a453df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.623 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.623 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquired lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.623 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.683 182096 DEBUG nova.compute.manager [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-changed-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.683 182096 DEBUG nova.compute.manager [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Refreshing instance network info cache due to event network-changed-a4f3c92a-baee-40e3-8bfd-a92067a453df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.684 182096 DEBUG oslo_concurrency.lockutils [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:03 compute-0 nova_compute[182092]: 2026-01-23 09:14:03.784 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.055 182096 DEBUG nova.network.neutron [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updating instance_info_cache with network_info: [{"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.070 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Releasing lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.070 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Instance network_info: |[{"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.070 182096 DEBUG oslo_concurrency.lockutils [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.071 182096 DEBUG nova.network.neutron [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Refreshing network info cache for port a4f3c92a-baee-40e3-8bfd-a92067a453df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.073 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Start _get_guest_xml network_info=[{"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 1, 'device_name': '/dev/vdb', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.075 182096 WARNING nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.079 182096 DEBUG nova.virt.libvirt.host [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.080 182096 DEBUG nova.virt.libvirt.host [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.083 182096 DEBUG nova.virt.libvirt.host [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.084 182096 DEBUG nova.virt.libvirt.host [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.084 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.085 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:13:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='1712767136',id=25,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-277211911',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.085 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.085 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.085 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.085 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.086 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.086 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.086 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.086 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.086 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.087 182096 DEBUG nova.virt.hardware [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.089 182096 DEBUG nova.virt.libvirt.vif [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:14:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-90546038',id=14,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-773wx7w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:14:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=52fc65a3-ff92-4710-bcde-048b467fdd86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.089 182096 DEBUG nova.network.os_vif_util [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.090 182096 DEBUG nova.network.os_vif_util [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.091 182096 DEBUG nova.objects.instance [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52fc65a3-ff92-4710-bcde-048b467fdd86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.098 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <uuid>52fc65a3-ff92-4710-bcde-048b467fdd86</uuid>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <name>instance-0000000e</name>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-90546038</nova:name>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:05</nova:creationTime>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:flavor name="tempest-flavor_with_ephemeral_1-277211911">
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:ephemeral>1</nova:ephemeral>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:user uuid="50480754599d4ae387d8c846a334d4bb">tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member</nova:user>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:project uuid="7866c4af706d4fac8aba28da7683a209">tempest-ServersWithSpecificFlavorTestJSON-1599791507</nova:project>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         <nova:port uuid="a4f3c92a-baee-40e3-8bfd-a92067a453df">
Jan 23 09:14:05 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="serial">52fc65a3-ff92-4710-bcde-048b467fdd86</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="uuid">52fc65a3-ff92-4710-bcde-048b467fdd86</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.eph0"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <target dev="vdb" bus="virtio"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.config"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:51:a0:f2"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <target dev="tapa4f3c92a-ba"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/console.log" append="off"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:05 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:05 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:05 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:05 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:05 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.099 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Preparing to wait for external event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.099 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.099 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.100 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.100 182096 DEBUG nova.virt.libvirt.vif [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:14:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-90546038',id=14,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-773wx7w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:14:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=52fc65a3-ff92-4710-bcde-048b467fdd86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.100 182096 DEBUG nova.network.os_vif_util [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.101 182096 DEBUG nova.network.os_vif_util [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.101 182096 DEBUG os_vif [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.101 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.102 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.102 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.104 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.104 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4f3c92a-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.104 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4f3c92a-ba, col_values=(('external_ids', {'iface-id': 'a4f3c92a-baee-40e3-8bfd-a92067a453df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:a0:f2', 'vm-uuid': '52fc65a3-ff92-4710-bcde-048b467fdd86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.105 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.1068] manager: (tapa4f3c92a-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.108 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.111 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.112 182096 INFO os_vif [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba')
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.148 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.149 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.149 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.150 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] No VIF found with MAC fa:16:3e:51:a0:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.150 182096 INFO nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Using config drive
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.303 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.422 182096 INFO nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Creating config drive at /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.config
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.428 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjibuqj9t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.548 182096 DEBUG oslo_concurrency.processutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjibuqj9t" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:05 compute-0 kernel: tapa4f3c92a-ba: entered promiscuous mode
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.5951] manager: (tapa4f3c92a-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 09:14:05 compute-0 ovn_controller[94697]: 2026-01-23T09:14:05Z|00044|binding|INFO|Claiming lport a4f3c92a-baee-40e3-8bfd-a92067a453df for this chassis.
Jan 23 09:14:05 compute-0 ovn_controller[94697]: 2026-01-23T09:14:05Z|00045|binding|INFO|a4f3c92a-baee-40e3-8bfd-a92067a453df: Claiming fa:16:3e:51:a0:f2 10.100.0.9
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.604 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a0:f2 10.100.0.9'], port_security=['fa:16:3e:51:a0:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '52fc65a3-ff92-4710-bcde-048b467fdd86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a4f3c92a-baee-40e3-8bfd-a92067a453df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.605 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a4f3c92a-baee-40e3-8bfd-a92067a453df in datapath 58d6a822-c968-4a80-a119-b33b7666b94b bound to our chassis
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.606 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:14:05 compute-0 ovn_controller[94697]: 2026-01-23T09:14:05Z|00046|binding|INFO|Setting lport a4f3c92a-baee-40e3-8bfd-a92067a453df ovn-installed in OVS
Jan 23 09:14:05 compute-0 ovn_controller[94697]: 2026-01-23T09:14:05Z|00047|binding|INFO|Setting lport a4f3c92a-baee-40e3-8bfd-a92067a453df up in Southbound
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.612 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.615 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[94a4a50b-6443-43be-9472-3ad2ba022aa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.616 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58d6a822-c1 in ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.619 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58d6a822-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b198f302-56be-47d7-aed6-de5565b80073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.621 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5006f8-a736-4054-a167-d80289214fbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 systemd-udevd[211244]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:14:05 compute-0 systemd-machined[153562]: New machine qemu-6-instance-0000000e.
Jan 23 09:14:05 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000e.
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.631 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e51167-96e9-466b-958c-2d456acfc7dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.6391] device (tapa4f3c92a-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.6397] device (tapa4f3c92a-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.643 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8b83e204-2617-4a85-a0da-1463e400bb43]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.664 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4bf575-bfd8-4157-8b5c-0c73f1494d24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 systemd-udevd[211248]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.667 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9ffde53b-b7ea-458f-8923-06cfd7a53bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.6699] manager: (tap58d6a822-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.691 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[98316076-c917-426c-9c48-f076208c3806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.693 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[95696f11-f010-417a-8ae1-54057413a48b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.7092] device (tap58d6a822-c0): carrier: link connected
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.713 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[787d56fe-2c88-4e92-9b1a-47cd87802d4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.726 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f15c92e-9198-4fa6-8613-b43771e747bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d6a822-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:81:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315119, 'reachable_time': 33738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211269, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.738 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e39f3eae-0208-49fb-88f4-f4ab678d0b4c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:8133'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 315119, 'tstamp': 315119}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211270, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac3fde9-b38d-4dd6-8814-809b2234ace0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d6a822-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:81:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315119, 'reachable_time': 33738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211271, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.773 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[771b3849-dff7-4bb4-b943-855063b599e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.787 182096 DEBUG nova.compute.manager [req-63c0abce-81aa-40fd-9c99-99609356ba7f req-c81d94bf-766c-4796-b684-385293af6571 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.787 182096 DEBUG oslo_concurrency.lockutils [req-63c0abce-81aa-40fd-9c99-99609356ba7f req-c81d94bf-766c-4796-b684-385293af6571 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.788 182096 DEBUG oslo_concurrency.lockutils [req-63c0abce-81aa-40fd-9c99-99609356ba7f req-c81d94bf-766c-4796-b684-385293af6571 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.788 182096 DEBUG oslo_concurrency.lockutils [req-63c0abce-81aa-40fd-9c99-99609356ba7f req-c81d94bf-766c-4796-b684-385293af6571 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.788 182096 DEBUG nova.compute.manager [req-63c0abce-81aa-40fd-9c99-99609356ba7f req-c81d94bf-766c-4796-b684-385293af6571 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Processing event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.820 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ff20477b-9a20-4375-80e0-3c958e58b3ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.820 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d6a822-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.821 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.821 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d6a822-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 NetworkManager[54920]: <info>  [1769159645.8233] manager: (tap58d6a822-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 09:14:05 compute-0 kernel: tap58d6a822-c0: entered promiscuous mode
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.822 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.829 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d6a822-c0, col_values=(('external_ids', {'iface-id': '4b01708c-9f7f-4701-bf11-ec5e700921db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:05 compute-0 ovn_controller[94697]: 2026-01-23T09:14:05Z|00048|binding|INFO|Releasing lport 4b01708c-9f7f-4701-bf11-ec5e700921db from this chassis (sb_readonly=0)
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.832 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:14:05 compute-0 nova_compute[182092]: 2026-01-23 09:14:05.844 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.841 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd9e874-4e6a-4b8c-8d03-c45e1d7272e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.846 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/58d6a822-c968-4a80-a119-b33b7666b94b.pid.haproxy
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 58d6a822-c968-4a80-a119-b33b7666b94b
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:14:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:05.846 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'env', 'PROCESS_TAG=haproxy-58d6a822-c968-4a80-a119-b33b7666b94b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58d6a822-c968-4a80-a119-b33b7666b94b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:14:06 compute-0 podman[211299]: 2026-01-23 09:14:06.137405961 +0000 UTC m=+0.038085681 container create 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:14:06 compute-0 systemd[1]: Started libpod-conmon-447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93.scope.
Jan 23 09:14:06 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:14:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/434d7a8ec5c725274256598c7948251851781958f3c6114bf8ca05b12b6677fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:14:06 compute-0 podman[211299]: 2026-01-23 09:14:06.205365101 +0000 UTC m=+0.106044811 container init 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:14:06 compute-0 podman[211299]: 2026-01-23 09:14:06.209722216 +0000 UTC m=+0.110401937 container start 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:14:06 compute-0 podman[211299]: 2026-01-23 09:14:06.116367021 +0000 UTC m=+0.017046743 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:14:06 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [NOTICE]   (211315) : New worker (211317) forked
Jan 23 09:14:06 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [NOTICE]   (211315) : Loading success.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.389 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159646.389014, 52fc65a3-ff92-4710-bcde-048b467fdd86 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.390 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] VM Started (Lifecycle Event)
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.393 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.395 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.398 182096 INFO nova.virt.libvirt.driver [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Instance spawned successfully.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.398 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.424 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.430 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.432 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.432 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.433 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.433 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.434 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.434 182096 DEBUG nova.virt.libvirt.driver [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.462 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.462 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159646.3897953, 52fc65a3-ff92-4710-bcde-048b467fdd86 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.462 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] VM Paused (Lifecycle Event)
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.487 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.489 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159646.394737, 52fc65a3-ff92-4710-bcde-048b467fdd86 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.489 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] VM Resumed (Lifecycle Event)
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.508 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.510 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.514 182096 INFO nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Took 5.32 seconds to spawn the instance on the hypervisor.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.514 182096 DEBUG nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.534 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.576 182096 INFO nova.compute.manager [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Took 5.81 seconds to build instance.
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.588 182096 DEBUG oslo_concurrency.lockutils [None req-34e1f600-4d18-425c-8433-2226f3668d2c 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.748 182096 DEBUG nova.network.neutron [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updated VIF entry in instance network info cache for port a4f3c92a-baee-40e3-8bfd-a92067a453df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.748 182096 DEBUG nova.network.neutron [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updating instance_info_cache with network_info: [{"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.763 182096 DEBUG oslo_concurrency.lockutils [req-38807e14-221f-448c-86f0-12e53cd876f6 req-d8c5282e-7d6e-4944-a9c6-682abd0f7384 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:06 compute-0 nova_compute[182092]: 2026-01-23 09:14:06.845 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.893 182096 DEBUG nova.compute.manager [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.893 182096 DEBUG oslo_concurrency.lockutils [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.893 182096 DEBUG oslo_concurrency.lockutils [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.893 182096 DEBUG oslo_concurrency.lockutils [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.893 182096 DEBUG nova.compute.manager [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] No waiting events found dispatching network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:14:07 compute-0 nova_compute[182092]: 2026-01-23 09:14:07.894 182096 WARNING nova.compute.manager [req-1f75a406-efc2-4a49-82bc-0003ffbbda0f req-df3fcfc7-aebf-486a-b53b-f321aebe226c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received unexpected event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df for instance with vm_state active and task_state None.
Jan 23 09:14:09 compute-0 podman[211330]: 2026-01-23 09:14:09.218205447 +0000 UTC m=+0.048178254 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:14:09 compute-0 podman[211329]: 2026-01-23 09:14:09.233223584 +0000 UTC m=+0.063691044 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Jan 23 09:14:09 compute-0 nova_compute[182092]: 2026-01-23 09:14:09.986 182096 DEBUG nova.compute.manager [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-changed-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:09 compute-0 nova_compute[182092]: 2026-01-23 09:14:09.986 182096 DEBUG nova.compute.manager [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Refreshing instance network info cache due to event network-changed-a4f3c92a-baee-40e3-8bfd-a92067a453df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:14:09 compute-0 nova_compute[182092]: 2026-01-23 09:14:09.987 182096 DEBUG oslo_concurrency.lockutils [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:09 compute-0 nova_compute[182092]: 2026-01-23 09:14:09.987 182096 DEBUG oslo_concurrency.lockutils [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:09 compute-0 nova_compute[182092]: 2026-01-23 09:14:09.987 182096 DEBUG nova.network.neutron [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Refreshing network info cache for port a4f3c92a-baee-40e3-8bfd-a92067a453df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:14:10 compute-0 nova_compute[182092]: 2026-01-23 09:14:10.106 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:10 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:14:10 compute-0 systemd[211142]: Activating special unit Exit the Session...
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped target Main User Target.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped target Basic System.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped target Paths.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped target Sockets.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped target Timers.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:14:10 compute-0 systemd[211142]: Closed D-Bus User Message Bus Socket.
Jan 23 09:14:10 compute-0 systemd[211142]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:14:10 compute-0 systemd[211142]: Removed slice User Application Slice.
Jan 23 09:14:10 compute-0 systemd[211142]: Reached target Shutdown.
Jan 23 09:14:10 compute-0 systemd[211142]: Finished Exit the Session.
Jan 23 09:14:10 compute-0 systemd[211142]: Reached target Exit the Session.
Jan 23 09:14:10 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:14:10 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:14:10 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:14:10 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:14:10 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:14:10 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:14:10 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:14:11 compute-0 nova_compute[182092]: 2026-01-23 09:14:11.719 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:11 compute-0 nova_compute[182092]: 2026-01-23 09:14:11.847 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.442 182096 DEBUG nova.network.neutron [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updated VIF entry in instance network info cache for port a4f3c92a-baee-40e3-8bfd-a92067a453df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.442 182096 DEBUG nova.network.neutron [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updating instance_info_cache with network_info: [{"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.457 182096 DEBUG oslo_concurrency.lockutils [req-9546a362-58a0-48a0-8d62-92dd14dd95ea req-fedc199b-a828-4a54-a33c-8144de470ce3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-52fc65a3-ff92-4710-bcde-048b467fdd86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.525 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159637.5238957, 99ca87e0-9a78-444e-bbf7-2c2e20731100 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.525 182096 INFO nova.compute.manager [-] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] VM Stopped (Lifecycle Event)
Jan 23 09:14:12 compute-0 nova_compute[182092]: 2026-01-23 09:14:12.538 182096 DEBUG nova.compute.manager [None req-3e5da279-01a4-469c-917c-affaf0961cab - - - - - -] [instance: 99ca87e0-9a78-444e-bbf7-2c2e20731100] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:13 compute-0 sshd-session[211369]: Accepted publickey for nova from 192.168.122.102 port 35250 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:14:13 compute-0 systemd-logind[746]: New session 43 of user nova.
Jan 23 09:14:13 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:14:13 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:14:13 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:14:13 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:14:13 compute-0 systemd[211373]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:13 compute-0 systemd[211373]: Queued start job for default target Main User Target.
Jan 23 09:14:13 compute-0 systemd[211373]: Created slice User Application Slice.
Jan 23 09:14:13 compute-0 systemd[211373]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:14:13 compute-0 systemd[211373]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:14:13 compute-0 systemd[211373]: Reached target Paths.
Jan 23 09:14:13 compute-0 systemd[211373]: Reached target Timers.
Jan 23 09:14:13 compute-0 systemd[211373]: Starting D-Bus User Message Bus Socket...
Jan 23 09:14:13 compute-0 systemd[211373]: Starting Create User's Volatile Files and Directories...
Jan 23 09:14:13 compute-0 systemd[211373]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:14:13 compute-0 systemd[211373]: Reached target Sockets.
Jan 23 09:14:13 compute-0 systemd[211373]: Finished Create User's Volatile Files and Directories.
Jan 23 09:14:13 compute-0 systemd[211373]: Reached target Basic System.
Jan 23 09:14:13 compute-0 systemd[211373]: Reached target Main User Target.
Jan 23 09:14:13 compute-0 systemd[211373]: Startup finished in 133ms.
Jan 23 09:14:13 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:14:13 compute-0 systemd[1]: Started Session 43 of User nova.
Jan 23 09:14:13 compute-0 sshd-session[211369]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:14 compute-0 sshd-session[211389]: Received disconnect from 192.168.122.102 port 35250:11: disconnected by user
Jan 23 09:14:14 compute-0 sshd-session[211389]: Disconnected from user nova 192.168.122.102 port 35250
Jan 23 09:14:14 compute-0 sshd-session[211369]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:14:14 compute-0 systemd-logind[746]: Session 43 logged out. Waiting for processes to exit.
Jan 23 09:14:14 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 09:14:14 compute-0 systemd-logind[746]: Removed session 43.
Jan 23 09:14:14 compute-0 sshd-session[211391]: Accepted publickey for nova from 192.168.122.102 port 35262 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:14:14 compute-0 systemd-logind[746]: New session 45 of user nova.
Jan 23 09:14:14 compute-0 systemd[1]: Started Session 45 of User nova.
Jan 23 09:14:14 compute-0 sshd-session[211391]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:14 compute-0 sshd-session[211394]: Received disconnect from 192.168.122.102 port 35262:11: disconnected by user
Jan 23 09:14:14 compute-0 sshd-session[211394]: Disconnected from user nova 192.168.122.102 port 35262
Jan 23 09:14:14 compute-0 sshd-session[211391]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:14:14 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 09:14:14 compute-0 systemd-logind[746]: Session 45 logged out. Waiting for processes to exit.
Jan 23 09:14:14 compute-0 systemd-logind[746]: Removed session 45.
Jan 23 09:14:14 compute-0 sshd-session[211396]: Accepted publickey for nova from 192.168.122.102 port 35264 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:14:14 compute-0 systemd-logind[746]: New session 46 of user nova.
Jan 23 09:14:14 compute-0 systemd[1]: Started Session 46 of User nova.
Jan 23 09:14:14 compute-0 sshd-session[211396]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:14:14 compute-0 sshd-session[211399]: Received disconnect from 192.168.122.102 port 35264:11: disconnected by user
Jan 23 09:14:14 compute-0 sshd-session[211399]: Disconnected from user nova 192.168.122.102 port 35264
Jan 23 09:14:14 compute-0 sshd-session[211396]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:14:14 compute-0 systemd[1]: session-46.scope: Deactivated successfully.
Jan 23 09:14:14 compute-0 systemd-logind[746]: Session 46 logged out. Waiting for processes to exit.
Jan 23 09:14:14 compute-0 systemd-logind[746]: Removed session 46.
Jan 23 09:14:14 compute-0 nova_compute[182092]: 2026-01-23 09:14:14.731 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:14 compute-0 nova_compute[182092]: 2026-01-23 09:14:14.732 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:14 compute-0 nova_compute[182092]: 2026-01-23 09:14:14.733 182096 DEBUG nova.network.neutron [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:14 compute-0 nova_compute[182092]: 2026-01-23 09:14:14.752 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:14.752 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:14.753 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:14:14 compute-0 nova_compute[182092]: 2026-01-23 09:14:14.879 182096 DEBUG nova.network.neutron [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.107 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.115 182096 DEBUG nova.network.neutron [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.125 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.219 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.220 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.220 182096 INFO nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Creating image(s)
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.221 182096 DEBUG nova.objects.instance [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.229 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.289 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.290 182096 DEBUG nova.virt.disk.api [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Checking if we can resize image /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.290 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.347 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.348 182096 DEBUG nova.virt.disk.api [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Cannot resize image /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.364 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.364 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Ensure instance console log exists: /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.365 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.365 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.365 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.367 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.371 182096 WARNING nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.377 182096 DEBUG nova.virt.libvirt.host [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.377 182096 DEBUG nova.virt.libvirt.host [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.380 182096 DEBUG nova.virt.libvirt.host [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.380 182096 DEBUG nova.virt.libvirt.host [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.381 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.381 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9e575731-b613-4b19-83e1-51cae9e2c5da',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.382 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.382 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.382 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.383 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.383 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.384 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.384 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.384 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.385 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.385 182096 DEBUG nova.virt.hardware [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.386 182096 DEBUG nova.objects.instance [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.397 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.453 182096 DEBUG oslo_concurrency.processutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.config --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.454 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.454 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.455 182096 DEBUG oslo_concurrency.lockutils [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.457 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <uuid>72b93748-a6a0-4c01-93f6-5509b35c13e4</uuid>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <name>instance-0000000d</name>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <memory>196608</memory>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:name>tempest-MigrationsAdminTest-server-427622826</nova:name>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:15</nova:creationTime>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:flavor name="m1.micro">
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:memory>192</nova:memory>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:user uuid="03b66d4c15354adcaae5e4e4cd711e1b">tempest-MigrationsAdminTest-1469473097-project-member</nova:user>
Jan 23 09:14:15 compute-0 nova_compute[182092]:         <nova:project uuid="0ce2bc741e474401a3f9c2ef00c19693">tempest-MigrationsAdminTest-1469473097</nova:project>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="serial">72b93748-a6a0-4c01-93f6-5509b35c13e4</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="uuid">72b93748-a6a0-4c01-93f6-5509b35c13e4</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/disk.config"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/console.log" append="off"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:15 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:15 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:15 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:15 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:15 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.505 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.505 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:15 compute-0 nova_compute[182092]: 2026-01-23 09:14:15.506 182096 INFO nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Using config drive
Jan 23 09:14:15 compute-0 systemd-machined[153562]: New machine qemu-7-instance-0000000d.
Jan 23 09:14:15 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000d.
Jan 23 09:14:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:15.755 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.848 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.948 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159656.947924, 72b93748-a6a0-4c01-93f6-5509b35c13e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.948 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] VM Resumed (Lifecycle Event)
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.952 182096 DEBUG nova.compute.manager [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.954 182096 INFO nova.virt.libvirt.driver [-] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance running successfully.
Jan 23 09:14:16 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.957 182096 DEBUG nova.virt.libvirt.guest [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.957 182096 DEBUG nova.virt.libvirt.driver [None req-4c64ffc2-f4ca-46d3-8311-a68fbdf02f30 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.981 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:16 compute-0 nova_compute[182092]: 2026-01-23 09:14:16.982 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.027 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.028 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159656.9518516, 72b93748-a6a0-4c01-93f6-5509b35c13e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.028 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] VM Started (Lifecycle Event)
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.047 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.049 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:17 compute-0 ovn_controller[94697]: 2026-01-23T09:14:17Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:51:a0:f2 10.100.0.9
Jan 23 09:14:17 compute-0 ovn_controller[94697]: 2026-01-23T09:14:17Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:a0:f2 10.100.0.9
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.786 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.787 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.787 182096 DEBUG nova.network.neutron [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:17 compute-0 nova_compute[182092]: 2026-01-23 09:14:17.931 182096 DEBUG nova.network.neutron [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.306 182096 DEBUG nova.network.neutron [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.321 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-72b93748-a6a0-4c01-93f6-5509b35c13e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.332 182096 DEBUG nova.virt.libvirt.driver [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Creating tmpfile /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4/tmp43__o83t to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 23 09:14:18 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 23 09:14:18 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Consumed 2.730s CPU time.
Jan 23 09:14:18 compute-0 systemd-machined[153562]: Machine qemu-7-instance-0000000d terminated.
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.568 182096 INFO nova.virt.libvirt.driver [-] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Instance destroyed successfully.
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.568 182096 DEBUG nova.objects.instance [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.578 182096 INFO nova.virt.libvirt.driver [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Deleting instance files /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4_del
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.582 182096 INFO nova.virt.libvirt.driver [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Deletion of /var/lib/nova/instances/72b93748-a6a0-4c01-93f6-5509b35c13e4_del complete
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.701 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.702 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.717 182096 DEBUG nova.objects.instance [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'migration_context' on Instance uuid 72b93748-a6a0-4c01-93f6-5509b35c13e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.794 182096 DEBUG nova.compute.provider_tree [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.804 182096 DEBUG nova.scheduler.client.report [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:18 compute-0 nova_compute[182092]: 2026-01-23 09:14:18.836 182096 DEBUG oslo_concurrency.lockutils [None req-87210e4a-b147-4000-9d72-8835831a2a9c 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:20 compute-0 nova_compute[182092]: 2026-01-23 09:14:20.109 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:21 compute-0 nova_compute[182092]: 2026-01-23 09:14:21.849 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:22 compute-0 podman[211455]: 2026-01-23 09:14:22.212194794 +0000 UTC m=+0.043626741 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:14:22 compute-0 podman[211471]: 2026-01-23 09:14:22.27012997 +0000 UTC m=+0.037321461 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.328 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.329 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.359 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.429 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.429 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.434 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.434 182096 INFO nova.compute.claims [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.548 182096 DEBUG nova.compute.provider_tree [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.566 182096 DEBUG nova.scheduler.client.report [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.587 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.588 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.636 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.636 182096 DEBUG nova.network.neutron [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.648 182096 INFO nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.660 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.744 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.745 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.745 182096 INFO nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Creating image(s)
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.746 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.746 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.747 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.758 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.805 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.806 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.807 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.816 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.861 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.862 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.881 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.882 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.883 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.926 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.927 182096 DEBUG nova.virt.disk.api [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Checking if we can resize image /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.927 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.971 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.972 182096 DEBUG nova.virt.disk.api [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Cannot resize image /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.972 182096 DEBUG nova.objects.instance [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'migration_context' on Instance uuid f7ddc4fd-b0a8-4309-bdb4-fd989a270ede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.990 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.991 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Ensure instance console log exists: /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.991 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.992 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:23 compute-0 nova_compute[182092]: 2026-01-23 09:14:23.992 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.038 182096 DEBUG nova.network.neutron [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.038 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.039 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.042 182096 WARNING nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.046 182096 DEBUG nova.virt.libvirt.host [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.046 182096 DEBUG nova.virt.libvirt.host [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.049 182096 DEBUG nova.virt.libvirt.host [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.049 182096 DEBUG nova.virt.libvirt.host [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.050 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.051 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.051 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.051 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.052 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.052 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.052 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.052 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.053 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.053 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.053 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.053 182096 DEBUG nova.virt.hardware [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.056 182096 DEBUG nova.objects.instance [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'pci_devices' on Instance uuid f7ddc4fd-b0a8-4309-bdb4-fd989a270ede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.074 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <uuid>f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</uuid>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <name>instance-00000012</name>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:name>tempest-MigrationsAdminTest-server-1833245038</nova:name>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:24</nova:creationTime>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:user uuid="03b66d4c15354adcaae5e4e4cd711e1b">tempest-MigrationsAdminTest-1469473097-project-member</nova:user>
Jan 23 09:14:24 compute-0 nova_compute[182092]:         <nova:project uuid="0ce2bc741e474401a3f9c2ef00c19693">tempest-MigrationsAdminTest-1469473097</nova:project>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="serial">f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="uuid">f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/console.log" append="off"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:24 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:24 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:24 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:24 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:24 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.118 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.119 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.119 182096 INFO nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Using config drive
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.238 182096 INFO nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Creating config drive at /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.243 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps4idv1jt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.359 182096 DEBUG oslo_concurrency.processutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps4idv1jt" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:24 compute-0 systemd-machined[153562]: New machine qemu-8-instance-00000012.
Jan 23 09:14:24 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000012.
Jan 23 09:14:24 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:14:24 compute-0 systemd[211373]: Activating special unit Exit the Session...
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped target Main User Target.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped target Basic System.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped target Paths.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped target Sockets.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped target Timers.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:14:24 compute-0 systemd[211373]: Closed D-Bus User Message Bus Socket.
Jan 23 09:14:24 compute-0 systemd[211373]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:14:24 compute-0 systemd[211373]: Removed slice User Application Slice.
Jan 23 09:14:24 compute-0 systemd[211373]: Reached target Shutdown.
Jan 23 09:14:24 compute-0 systemd[211373]: Finished Exit the Session.
Jan 23 09:14:24 compute-0 systemd[211373]: Reached target Exit the Session.
Jan 23 09:14:24 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:14:24 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:14:24 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:14:24 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:14:24 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:14:24 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:14:24 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.759 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159664.758826, f7ddc4fd-b0a8-4309-bdb4-fd989a270ede => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.760 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] VM Resumed (Lifecycle Event)
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.763 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.763 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.766 182096 INFO nova.virt.libvirt.driver [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance spawned successfully.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.767 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.883 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.886 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.904 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.904 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159664.7599812, f7ddc4fd-b0a8-4309-bdb4-fd989a270ede => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.905 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] VM Started (Lifecycle Event)
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.908 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.908 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.909 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.909 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.910 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.910 182096 DEBUG nova.virt.libvirt.driver [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.934 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.937 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.959 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.980 182096 INFO nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Took 1.24 seconds to spawn the instance on the hypervisor.
Jan 23 09:14:24 compute-0 nova_compute[182092]: 2026-01-23 09:14:24.981 182096 DEBUG nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.060 182096 INFO nova.compute.manager [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Took 1.65 seconds to build instance.
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.070 182096 DEBUG oslo_concurrency.lockutils [None req-1ca3bcc7-4fe1-4669-b1ad-baf4a1f419de 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.110 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.388 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.388 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.389 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.389 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.390 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.397 182096 INFO nova.compute.manager [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Terminating instance
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.402 182096 DEBUG nova.compute.manager [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:14:25 compute-0 kernel: tapa4f3c92a-ba (unregistering): left promiscuous mode
Jan 23 09:14:25 compute-0 NetworkManager[54920]: <info>  [1769159665.4215] device (tapa4f3c92a-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00049|binding|INFO|Releasing lport a4f3c92a-baee-40e3-8bfd-a92067a453df from this chassis (sb_readonly=0)
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00050|binding|INFO|Setting lport a4f3c92a-baee-40e3-8bfd-a92067a453df down in Southbound
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00051|binding|INFO|Removing iface tapa4f3c92a-ba ovn-installed in OVS
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.431 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.449 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a0:f2 10.100.0.9'], port_security=['fa:16:3e:51:a0:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '52fc65a3-ff92-4710-bcde-048b467fdd86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a4f3c92a-baee-40e3-8bfd-a92067a453df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.450 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a4f3c92a-baee-40e3-8bfd-a92067a453df in datapath 58d6a822-c968-4a80-a119-b33b7666b94b unbound from our chassis
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.451 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d6a822-c968-4a80-a119-b33b7666b94b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.452 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c488f472-85e6-438a-a786-9098a8ee8d26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.452 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b namespace which is not needed anymore
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.455 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 23 09:14:25 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000e.scope: Consumed 11.741s CPU time.
Jan 23 09:14:25 compute-0 systemd-machined[153562]: Machine qemu-6-instance-0000000e terminated.
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [NOTICE]   (211315) : haproxy version is 2.8.14-c23fe91
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [NOTICE]   (211315) : path to executable is /usr/sbin/haproxy
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [WARNING]  (211315) : Exiting Master process...
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [WARNING]  (211315) : Exiting Master process...
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [ALERT]    (211315) : Current worker (211317) exited with code 143 (Terminated)
Jan 23 09:14:25 compute-0 neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b[211311]: [WARNING]  (211315) : All workers exited. Exiting... (0)
Jan 23 09:14:25 compute-0 systemd[1]: libpod-447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93.scope: Deactivated successfully.
Jan 23 09:14:25 compute-0 conmon[211311]: conmon 447b8a8f02a8816a095d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93.scope/container/memory.events
Jan 23 09:14:25 compute-0 podman[211556]: 2026-01-23 09:14:25.565614012 +0000 UTC m=+0.042262700 container died 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:14:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93-userdata-shm.mount: Deactivated successfully.
Jan 23 09:14:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-434d7a8ec5c725274256598c7948251851781958f3c6114bf8ca05b12b6677fa-merged.mount: Deactivated successfully.
Jan 23 09:14:25 compute-0 podman[211556]: 2026-01-23 09:14:25.583970272 +0000 UTC m=+0.060618960 container cleanup 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:14:25 compute-0 systemd[1]: libpod-conmon-447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93.scope: Deactivated successfully.
Jan 23 09:14:25 compute-0 kernel: tapa4f3c92a-ba: entered promiscuous mode
Jan 23 09:14:25 compute-0 NetworkManager[54920]: <info>  [1769159665.6165] manager: (tapa4f3c92a-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00052|binding|INFO|Claiming lport a4f3c92a-baee-40e3-8bfd-a92067a453df for this chassis.
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 kernel: tapa4f3c92a-ba (unregistering): left promiscuous mode
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00053|binding|INFO|a4f3c92a-baee-40e3-8bfd-a92067a453df: Claiming fa:16:3e:51:a0:f2 10.100.0.9
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.625 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a0:f2 10.100.0.9'], port_security=['fa:16:3e:51:a0:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '52fc65a3-ff92-4710-bcde-048b467fdd86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a4f3c92a-baee-40e3-8bfd-a92067a453df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.637 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 podman[211577]: 2026-01-23 09:14:25.651299465 +0000 UTC m=+0.050267987 container remove 447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.654 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 ovn_controller[94697]: 2026-01-23T09:14:25Z|00054|binding|INFO|Releasing lport a4f3c92a-baee-40e3-8bfd-a92067a453df from this chassis (sb_readonly=0)
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.669 182096 INFO nova.virt.libvirt.driver [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Instance destroyed successfully.
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.670 182096 DEBUG nova.objects.instance [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lazy-loading 'resources' on Instance uuid 52fc65a3-ff92-4710-bcde-048b467fdd86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.671 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a0:f2 10.100.0.9'], port_security=['fa:16:3e:51:a0:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '52fc65a3-ff92-4710-bcde-048b467fdd86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d6a822-c968-4a80-a119-b33b7666b94b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7866c4af706d4fac8aba28da7683a209', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb83c6ea-b921-4a8a-b1b2-bf3532b122b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dad8fa26-a511-4939-9e59-86d36d58a8b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a4f3c92a-baee-40e3-8bfd-a92067a453df) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.669 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[745cbbfb-3617-44e9-9269-d9f8f47a6aef]: (4, ('Fri Jan 23 09:14:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b (447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93)\n447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93\nFri Jan 23 09:14:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b (447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93)\n447b8a8f02a8816a095da9cd62959b7b5bad0575a66b44bdef2c2674733c9f93\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.674 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[51402411-d24a-4222-ade8-b7fe548eb542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.675 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d6a822-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 kernel: tap58d6a822-c0: left promiscuous mode
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.688 182096 DEBUG nova.virt.libvirt.vif [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:14:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-90546038',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-90546038',id=14,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnF9M/pmB8riKp/krNOOuokjWx9uWi7HlYZOvg1RAqabrh1siUuCWMfdq65vPoDZvjaYeqGc3d2cGJJGvJe4w2EAkygSiKea9MRplM49W4SX9X/cxJDh1vE8FS91IQ+UA==',key_name='tempest-keypair-1201646728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:14:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7866c4af706d4fac8aba28da7683a209',ramdisk_id='',reservation_id='r-773wx7w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1599791507-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:14:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='50480754599d4ae387d8c846a334d4bb',uuid=52fc65a3-ff92-4710-bcde-048b467fdd86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.690 182096 DEBUG nova.network.os_vif_util [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converting VIF {"id": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "address": "fa:16:3e:51:a0:f2", "network": {"id": "58d6a822-c968-4a80-a119-b33b7666b94b", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-952764055-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7866c4af706d4fac8aba28da7683a209", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4f3c92a-ba", "ovs_interfaceid": "a4f3c92a-baee-40e3-8bfd-a92067a453df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.690 182096 DEBUG nova.network.os_vif_util [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.691 182096 DEBUG os_vif [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.692 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4f3c92a-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.693 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.696 182096 INFO os_vif [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:a0:f2,bridge_name='br-int',has_traffic_filtering=True,id=a4f3c92a-baee-40e3-8bfd-a92067a453df,network=Network(58d6a822-c968-4a80-a119-b33b7666b94b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4f3c92a-ba')
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.697 182096 INFO nova.virt.libvirt.driver [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Deleting instance files /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86_del
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.697 182096 INFO nova.virt.libvirt.driver [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Deletion of /var/lib/nova/instances/52fc65a3-ff92-4710-bcde-048b467fdd86_del complete
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.697 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[72710ca0-ade2-4deb-9485-0621b2edf780]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.709 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab1d654-6d2a-475a-aec9-9813de78d61b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.711 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5b6a7d-5944-439c-89a6-b65b8f82dc21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.723 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20abbb25-fa9d-4e38-9b96-884b952721fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 315114, 'reachable_time': 15145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211619, 'error': None, 'target': 'ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d58d6a822\x2dc968\x2d4a80\x2da119\x2db33b7666b94b.mount: Deactivated successfully.
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.725 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58d6a822-c968-4a80-a119-b33b7666b94b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.726 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d5dee7-5723-4bb6-b97f-65ee6cb3347e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.726 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a4f3c92a-baee-40e3-8bfd-a92067a453df in datapath 58d6a822-c968-4a80-a119-b33b7666b94b unbound from our chassis
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.727 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d6a822-c968-4a80-a119-b33b7666b94b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.728 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6508391e-0ed8-49a9-9129-753308f5fc72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.728 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a4f3c92a-baee-40e3-8bfd-a92067a453df in datapath 58d6a822-c968-4a80-a119-b33b7666b94b unbound from our chassis
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.729 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d6a822-c968-4a80-a119-b33b7666b94b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.732 182096 DEBUG nova.compute.manager [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-unplugged-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.732 182096 DEBUG oslo_concurrency.lockutils [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.732 182096 DEBUG oslo_concurrency.lockutils [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.733 182096 DEBUG oslo_concurrency.lockutils [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.733 182096 DEBUG nova.compute.manager [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] No waiting events found dispatching network-vif-unplugged-a4f3c92a-baee-40e3-8bfd-a92067a453df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.733 182096 DEBUG nova.compute.manager [req-0b07b42d-f2e4-4de7-9cfb-de2e0c16da95 req-2e2672a1-58cb-4a65-9433-c2bb62411380 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-unplugged-a4f3c92a-baee-40e3-8bfd-a92067a453df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:14:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:25.734 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[382a38ef-67c1-41fe-9512-c08107a2bece]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.756 182096 INFO nova.compute.manager [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.756 182096 DEBUG oslo.service.loopingcall [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.756 182096 DEBUG nova.compute.manager [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:14:25 compute-0 nova_compute[182092]: 2026-01-23 09:14:25.756 182096 DEBUG nova.network.neutron [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:14:26 compute-0 nova_compute[182092]: 2026-01-23 09:14:26.850 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:27 compute-0 podman[211620]: 2026-01-23 09:14:27.223243796 +0000 UTC m=+0.048219692 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.597 182096 DEBUG nova.network.neutron [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.638 182096 INFO nova.compute.manager [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Took 1.88 seconds to deallocate network for instance.
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.698 182096 DEBUG nova.compute.manager [req-b25c07ae-610a-40c8-93f8-2556de0ef382 req-80b6a5ea-e95b-431a-bdf5-1407453d6824 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-deleted-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.717 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.717 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.824 182096 DEBUG nova.compute.provider_tree [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.838 182096 DEBUG nova.compute.manager [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.838 182096 DEBUG oslo_concurrency.lockutils [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.838 182096 DEBUG oslo_concurrency.lockutils [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.839 182096 DEBUG oslo_concurrency.lockutils [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.839 182096 DEBUG nova.compute.manager [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] No waiting events found dispatching network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.839 182096 WARNING nova.compute.manager [req-5f325feb-a087-4afb-a2e1-b070633373af req-ecc9b46a-71a7-44d3-af41-e3b43b409d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Received unexpected event network-vif-plugged-a4f3c92a-baee-40e3-8bfd-a92067a453df for instance with vm_state deleted and task_state None.
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.855 182096 DEBUG nova.scheduler.client.report [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.873 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.897 182096 INFO nova.scheduler.client.report [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Deleted allocations for instance 52fc65a3-ff92-4710-bcde-048b467fdd86
Jan 23 09:14:27 compute-0 nova_compute[182092]: 2026-01-23 09:14:27.955 182096 DEBUG oslo_concurrency.lockutils [None req-7ad61566-90cb-4c30-aa67-5e3fb3412859 50480754599d4ae387d8c846a334d4bb 7866c4af706d4fac8aba28da7683a209 - - default default] Lock "52fc65a3-ff92-4710-bcde-048b467fdd86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:28 compute-0 nova_compute[182092]: 2026-01-23 09:14:28.535 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:28 compute-0 nova_compute[182092]: 2026-01-23 09:14:28.536 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquired lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:28 compute-0 nova_compute[182092]: 2026-01-23 09:14:28.536 182096 DEBUG nova.network.neutron [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:28 compute-0 nova_compute[182092]: 2026-01-23 09:14:28.672 182096 DEBUG nova.network.neutron [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:29 compute-0 nova_compute[182092]: 2026-01-23 09:14:29.672 182096 DEBUG nova.network.neutron [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:29 compute-0 nova_compute[182092]: 2026-01-23 09:14:29.690 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Releasing lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:29 compute-0 nova_compute[182092]: 2026-01-23 09:14:29.793 182096 DEBUG nova.virt.libvirt.driver [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 23 09:14:29 compute-0 nova_compute[182092]: 2026-01-23 09:14:29.794 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Creating file /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/f097fe40dd7e4418988639c60cdc41f6.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 23 09:14:29 compute-0 nova_compute[182092]: 2026-01-23 09:14:29.794 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/f097fe40dd7e4418988639c60cdc41f6.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.114 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/f097fe40dd7e4418988639c60cdc41f6.tmp" returned: 1 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.115 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/f097fe40dd7e4418988639c60cdc41f6.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.115 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Creating directory /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.116 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.278 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.281 182096 DEBUG nova.virt.libvirt.driver [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:14:30 compute-0 nova_compute[182092]: 2026-01-23 09:14:30.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:31 compute-0 nova_compute[182092]: 2026-01-23 09:14:31.851 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:33.518 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3973e950f0ad6b60c77919bcf2579e13dd1674cd8c4e3330f95dcdc2467f47a5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 23 09:14:33 compute-0 nova_compute[182092]: 2026-01-23 09:14:33.567 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159658.5665197, 72b93748-a6a0-4c01-93f6-5509b35c13e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:33 compute-0 nova_compute[182092]: 2026-01-23 09:14:33.567 182096 INFO nova.compute.manager [-] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] VM Stopped (Lifecycle Event)
Jan 23 09:14:33 compute-0 nova_compute[182092]: 2026-01-23 09:14:33.590 182096 DEBUG nova.compute.manager [None req-baa26fad-c237-47dd-8178-dbeca8f5d584 - - - - - -] [instance: 72b93748-a6a0-4c01-93f6-5509b35c13e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:34 compute-0 podman[211640]: 2026-01-23 09:14:34.219324666 +0000 UTC m=+0.057643377 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:14:34 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:34.893 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 23 Jan 2026 09:14:33 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9e54aea5-f505-4695-9359-8ce56649ad1a x-openstack-request-id: req-9e54aea5-f505-4695-9359-8ce56649ad1a _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 23 09:14:34 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:34.894 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "98e818ca-8ca1-4177-8a64-bde266c399d2", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}]}, {"id": "9e575731-b613-4b19-83e1-51cae9e2c5da", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 23 09:14:34 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:34.894 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-9e54aea5-f505-4695-9359-8ce56649ad1a request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 23 09:14:34 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:34.896 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3973e950f0ad6b60c77919bcf2579e13dd1674cd8c4e3330f95dcdc2467f47a5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 23 09:14:35 compute-0 nova_compute[182092]: 2026-01-23 09:14:35.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:36 compute-0 nova_compute[182092]: 2026-01-23 09:14:36.853 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.267 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Fri, 23 Jan 2026 09:14:34 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-186ef592-9f3f-4eaa-b4b1-eb7cab98b1af x-openstack-request-id: req-186ef592-9f3f-4eaa-b4b1-eb7cab98b1af _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.267 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "9e575731-b613-4b19-83e1-51cae9e2c5da", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.268 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da used request id req-186ef592-9f3f-4eaa-b4b1-eb7cab98b1af request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.269 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'name': 'tempest-MigrationsAdminTest-server-2016787926', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000009', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0ce2bc741e474401a3f9c2ef00c19693', 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'hostId': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.271 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3973e950f0ad6b60c77919bcf2579e13dd1674cd8c4e3330f95dcdc2467f47a5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.350 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 23 Jan 2026 09:14:37 GMT Keep-Alive: timeout=5, max=98 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-33afdc86-5f65-4962-a393-ddff21997c0c x-openstack-request-id: req-33afdc86-5f65-4962-a393-ddff21997c0c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.350 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "98e818ca-8ca1-4177-8a64-bde266c399d2", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}]}, {"id": "9e575731-b613-4b19-83e1-51cae9e2c5da", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9e575731-b613-4b19-83e1-51cae9e2c5da"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.351 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-33afdc86-5f65-4962-a393-ddff21997c0c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.352 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3973e950f0ad6b60c77919bcf2579e13dd1674cd8c4e3330f95dcdc2467f47a5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.472 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 23 Jan 2026 09:14:37 GMT Keep-Alive: timeout=5, max=97 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-58c26aef-6cb6-4b1d-a85b-9e0fe7ddfaa9 x-openstack-request-id: req-58c26aef-6cb6-4b1d-a85b-9e0fe7ddfaa9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.472 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "98e818ca-8ca1-4177-8a64-bde266c399d2", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.472 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/98e818ca-8ca1-4177-8a64-bde266c399d2 used request id req-58c26aef-6cb6-4b1d-a85b-9e0fe7ddfaa9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.473 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'name': 'tempest-MigrationsAdminTest-server-1833245038', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000012', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0ce2bc741e474401a3f9c2ef00c19693', 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'hostId': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.474 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'name': 'tempest-MigrationsAdminTest-server-1223466208', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0ce2bc741e474401a3f9c2ef00c19693', 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'hostId': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.475 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.475 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.475 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>]
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.493 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.494 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.511 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.bytes volume: 72712192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.512 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.530 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.bytes volume: 335872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.531 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8374245-d2f0-4801-8396-42558135654a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0c8da7c-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': 'fd584954c9d2438b585deace1fbe4bd6bc5ae8cdc6a0812cb82e028360faa1cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 
'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0c8e9c2-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '3bcc5e6e27701208289de2966801f676239b2539183204e5d667df7a3d95fca6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72712192, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0cb9b5e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '4281955b53d6a636f4e650f96b86e5a8c7bb83b2f4e21c8fbfb8cb243732457c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0cba6ee-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '0965235ed013c17731db239702cea0e98a5520f4e8c7463b893470e78867570f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 335872, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 
'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0ce9da4-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '9ba0a90c3875b6f95f383bed1ebbde83146c4004b252b8d2313705b5c3add999'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.476027', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_g
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: b': 1, 'disk_name': 'sda'}, 'message_id': 'f0ceae20-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '0d1690a684cb53935d6531dbe9164ebf891f7505aaad891afdb4293bfef226ea'}]}, 'timestamp': '2026-01-23 09:14:37.532325', '_unique_id': '97e3202238784238a1cc913ad74e658b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.544 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.545 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.553 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.553 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.560 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.560 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '444d014c-6c2f-4f43-abfc-c5d74d1169ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d0a946-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '9663f7b1cd728fc4649f4cf8b47ca0315208129e33e62f40e759bb134c7bd2d5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 
'0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d0b1fc-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '46656fe83921b95ad040fcf583507cd517550bd1e665e07d4d4dcea9bbd445d3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d1ee64-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': '7379900731ef4410ea98eac541252b7f634984d86215715b6f8f9a14cd08de4c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d1fa12-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': 'c43930d4790c6ab71a17202aa1dfb18b5c492e7d26ba75033e01b9a92d627ca2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': 
'0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d2fc1e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '0cafaaf4c462f9ffb1be2f1b922e84b4d3d4122f5597e0cc049117ceb53fc101'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.538197', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sd
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: a'}, 'message_id': 'f0d304de-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '789ca9252c890994a81b4a5bee7d1dcf70436cfa697973f0dd870868d908bf3b'}]}, 'timestamp': '2026-01-23 09:14:37.560707', '_unique_id': '203baac8934741c28fcf45b03a943500'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.566 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.566 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.567 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>]
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.567 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.577 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/cpu volume: 9730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.587 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/cpu volume: 10120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.597 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/cpu volume: 10010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6db107c6-e8c3-4efc-968c-2e22fbea989a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9730000000, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'timestamp': '2026-01-23T09:14:37.567452', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f0d59730-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.10639201, 'message_signature': '9078ed3588c3995afc8f1c8a099bea55cce0e08e7a7cd9c3d3db8b22769ded6f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10120000000, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 
'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'timestamp': '2026-01-23T09:14:37.567452', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f0d72f6e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.117042207, 'message_signature': 'd8ee70e28b46eec7cb5f1241348829e0135852618418874bb1d7112bd5194705'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10010000000, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'timestamp': '2026-01-23T09:14:37.567452', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': 
'84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f0d8b51e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.126968027, 'message_signature': 'ac49e62a30d3fc438ab2b4af7e46c22b7cb1ad9cb502daae420af644fc44745c'}]}, 'timestamp': '2026-01-23 09:14:37.597931', '_unique_id': '10114479b04942cdaa14cb80ac55b142'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.598 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.599 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.599 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.599 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.600 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.600 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.600 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03adf972-78c5-4d9e-af74-95c3bdfa5808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d8f4de-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '53663a87adb102b31e63e34d891ee7de2b99ad1bf5a2c3948d40c173c6dc8466'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 
'0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d8fdd0-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '0f87a8709c0678f10940cc6e776ece82abf2521fb22c8d21aae59b810b1a739d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d907b2-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': 'dc157a366442b468c9077d12708d0ff683cf8739dfc689bc46b9b44e469e61cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d90f82-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': '75d336437ef46b27ebca569ab1bc3bd6d3be7e894e6f45dcdf71d3aced22481c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 
'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d91720-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '2a53d8be02dd04a42f8640ecc5f778ac69922810f266993013d9afa87b77fd38'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.599318', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: ': 'sda'}, 'message_id': 'f0d91f04-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '17b822f8bf752ea8919204cace25bd54d2d5910cb3305b6ccffc10150ec4936f'}]}, 'timestamp': '2026-01-23 09:14:37.600627', '_unique_id': '8e1960502bcd4ad1bfcf3c55f1b17ce8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.601 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.latency volume: 22870493 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.602 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.602 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.latency volume: 352035908 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.602 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.602 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.latency volume: 40301453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '265dfe08-41ce-49de-b723-891899bcf2cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22870493, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d95aa0-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '5ef18e2eef26f3db04818908b357570c95be725911ed91809b80bce4870c7778'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 
'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d96298-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '527ad084af9fe0589c6a3e6514c00c3b64b7ba50347754d6046593db231ff8de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 352035908, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d96b08-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '4bbd2eefe7f10619ce887902235b8a222b03eebf86b1f3b2cedc6a69c885ffe3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d9731e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '6663429530555f4922d7e2681a53f57ef63f34b3245e10dd5d3dc0f3a2156438'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40301453, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 
'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d97c2e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '5e8561661857c59841f314ef7dbab79e501557e9aa6fa46ee6043910c896481e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.601942', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ep
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: hemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d983fe-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': 'c15ce9a718ea697b8e6372d2e83811cfef8d5f1e67a2a08eb541c577a3e31769'}]}, 'timestamp': '2026-01-23 09:14:37.603211', '_unique_id': '5a31110db387474c882c34f920eefe61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.604 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.latency volume: 278355382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.604 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.latency volume: 15950451 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.605 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.latency volume: 244629022 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.605 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.latency volume: 19942082 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.605 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.latency volume: 240180047 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.605 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.latency volume: 21654078 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57acb1cc-5163-4ff2-83bb-9e12018be2ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 278355382, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d9c3fa-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': 'c3e97193d3751bcb7cac3ef9beb53dd6bbb47c68f8d9a1e395810f1650a635ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15950451, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': 
None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d9ce40-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '2d699390d13c9bce4fbb81d48a784fa8c7f7ae8454ab84fefcfcfc2ac5ccfe61'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 244629022, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d9d624-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '3cf86a202e4b48e8dce71d354d1f4a2bf4164a919b1d37ea49f0f9e21afbd0cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19942082, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d9de12-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '3f34b63a3f60b8ae9745c7a7d870e3bc3f5973d628d2f12038a4b260ec7e153b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 240180047, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': 
None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0d9e574-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': 'b8e340c95e4d126315c4fdfa188bfeeb049db5ff29bc9774fc6f3bed8aa99541'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21654078, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.604617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0d9ed94-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '01c1ac540da88afb38213dd1b50a2a58b81c84a7430dfc852c96a77dd4097c01'}]}, 'timestamp': '2026-01-23 09:14:37.605916', '_unique_id': 'd0c33f10a71441b99586836d81908b8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>]
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.607 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.608 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.bytes volume: 29538816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.608 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.608 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.bytes volume: 32016384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.608 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bef2459e-3ad4-4f51-9d6e-a98f146fd75f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0da368c-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '2b7800cd57efd39b4a0d88f622abfc1e224fc528fc5d86499febc98c2fae013d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 
'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0da3eac-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '7e803d0d5885ab742e25ea559c73d04bbb233c090e678b94e3f4e86646b20d08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29538816, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0da4852-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '5df23608cef9939bb61f455aa1f14007737488b4e0b41af896ddf8d9cc8c8db2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0da4fc8-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '3828e7e1b33518ecdfce22ec331f84ea66951d8b1e59656f3fabbbfad175f340'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32016384, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 
'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0da57a2-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': 'e357793c1c2fe6a5476ed3a2dfb4d17b356f227d98c9e9b9bca4a205493d17a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.607559', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0da5fae-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '7022c6e3b788d7e27c8b914c9e6f4a44a12288cca13a9029aa913fdec92e31bf'}]}, 'timestamp': '2026-01-23 09:14:37.608836', '_unique_id': 'b966c4d3dae442faae50f576d9c95d0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.609 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.610 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/memory.usage volume: 43.01171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.610 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/memory.usage volume: 40.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.610 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/memory.usage volume: 40.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47c2cded-0f9e-48af-b790-f677a9a0e596', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.01171875, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'timestamp': '2026-01-23T09:14:37.610110', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f0da99d8-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.10639201, 'message_signature': '331efb3c8c7a54c24345996aa1b87e9121d816f9744ac9e565962ac7fc92fbd4'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.35546875, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 
'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'timestamp': '2026-01-23T09:14:37.610110', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f0daa270-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.117042207, 'message_signature': 'b470cd3f9535e475432236da822adadf322b6b9e440e873ba324b9d885180fa8'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.6796875, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'timestamp': '2026-01-23T09:14:37.610110', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': 
'84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f0daaafe-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.126968027, 'message_signature': '7e429bc1daecb351f22e791282e2311a135a82f66e45bcb6de14d793cd488c63'}]}, 'timestamp': '2026-01-23 09:14:37.610768', '_unique_id': 'd6b8511906c744b1a50964188d27ea17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.611 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.612 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.612 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.612 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.requests volume: 1056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.612 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.612 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.requests volume: 1205 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7093935-db92-4068-8294-14c2b1999c2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0dae690-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': 'ca5bcb75e35712ee9b7905fae975fa4cfd021bb018318d6087f8ad068422291e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': 
None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0daef00-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '0721db4f4032ad8d7d976fa83e29aa89b700656c81543c0756d352e001c1a9b3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1056, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0daf6da-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': 'dc202df173e62da901a74aa783307e67d795c8f3216127dd7ceaea374788aa3b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0daff36-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': 'c4c7f4e0f1a1274bfdf4b737e584869bb8369bdbedae50602d68696f6cf49a21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1205, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 
'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0db0742-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '8cc716086015a92229f4a091f7e10384257652762999d2d4c690eb1e41754763'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.612037', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb':
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]:  128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0db1066-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': 'a803c7267f141aa2986784ac8d02d308f6f220e9226652e449cd7092cbcc7c6d'}]}, 'timestamp': '2026-01-23 09:14:37.613373', '_unique_id': 'f25f8b4a436a4c66b7f515e4f7b2ec62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.614 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.614 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.615 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.requests volume: 310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.615 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.615 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.requests volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.615 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eca03822-30d0-4c55-a50f-698e7c6b6d34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0db4f86-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '545a093ee233a54562fe646cc693448412d5239e2abcfcbfafb6609549711ec9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': 
None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0db576a-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.005562736, 'message_signature': '507b8d382d8af233246d5529119eb07de0b2d875474e6f02fbf6b50bc64939dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 310, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0db6110-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': 'c4901af595126e0cfa9e0dd221a17cbccf86c47c0f10b4da89f8e048a385e9d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0db6912-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.02379901, 'message_signature': '8f95844a2f8878b8041d261b2149dbec22d02741ff32ef6a0b2ad6cd75dfcecd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 41, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': 
None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0db7146-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': '93186be2dc11f746748aae0dd64a57cebdeee5becebb40f2ae3869ef5a60b950'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.614769', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]:  'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0db78bc-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.041738694, 'message_signature': 'f9007ed5dd625be01441ca7e31be7c7311e7d352acb734f14a18736bc2c9e94e'}]}, 'timestamp': '2026-01-23 09:14:37.616031', '_unique_id': 'e3ddbe76468c47f4b7468ee944ffcbc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.617 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.617 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.617 12 DEBUG ceilometer.compute.pollsters [-] 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.617 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.617 12 DEBUG ceilometer.compute.pollsters [-] f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.618 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.618 12 DEBUG ceilometer.compute.pollsters [-] 7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f0b6f06-a4e7-4da5-b625-bb796cd2451c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-vda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0dbb17e-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '407cd4b6d68dae730e1bd78b05428d26304da23946a3439f774a3ea88107b5fb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 
'0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-sda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2016787926', 'name': 'instance-00000009', 'instance_id': '0e8dc87a-78f6-4d2d-b19b-6c51917bfd58', 'instance_type': 'm1.micro', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '9e575731-b613-4b19-83e1-51cae9e2c5da', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0dbb976-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.067666604, 'message_signature': '8e957f0718915da80907a0de5f6e25bf6e9a4cca51d7f96f3da3f105fd4508b9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-vda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0dbc1d2-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': 'cfe4c8d2d621a657e2c56b8a2961633192ffe702911ca8673cdcc9fd0526fe75'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-sda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1833245038', 'name': 'instance-00000012', 'instance_id': 'f7ddc4fd-b0a8-4309-bdb4-fd989a270ede', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0dbc952-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.07478588, 'message_signature': '90ccf536861597aee353ce43ff063006dd86863a7a90300b06981cb07f57c5b1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': 
'0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-vda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f0dbd078-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '384b1861159ec67c57e76551db57cfc056a12e36b99bf068790f2ef6a3d59b00'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'user_name': None, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'project_name': None, 'resource_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928-sda', 'timestamp': '2026-01-23T09:14:37.617263', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1223466208', 'name': 'instance-00000005', 'instance_id': '7bf6771c-28de-4cd3-a95a-d8a3e8b25928', 'instance_type': 'm1.nano', 'host': 'e6db7d3c554af8071c91b2e9996d2a82e4dd9b04b35efc46a19180fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f0db
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: da0a-f83b-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3183.083179822, 'message_signature': '6527e5e995678a6b20b4878eb6e03e9b41fd5eb98c5d26f18a240a9c21c42850'}]}, 'timestamp': '2026-01-23 09:14:37.618523', '_unique_id': '8f8c9b6204cf4427b6422215e6c405f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:14:37 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:14:37.619 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-2016787926>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1833245038>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1223466208>]
Jan 23 09:14:37 compute-0 nova_compute[182092]: 2026-01-23 09:14:37.644 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:37 compute-0 nova_compute[182092]: 2026-01-23 09:14:37.650 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "8c2049e7-3c68-4550-a1c7-541164b292b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:37 compute-0 nova_compute[182092]: 2026-01-23 09:14:37.651 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:37 compute-0 nova_compute[182092]: 2026-01-23 09:14:37.692 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:14:37 compute-0 nova_compute[182092]: 2026-01-23 09:14:37.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.536 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.561 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.601 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.603 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.606 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.609 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.613 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.616 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:37 compute-0 rsyslogd[962]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:14:37.619 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 09:14:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:39.852 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:39.852 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:39.852 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:40 compute-0 podman[211674]: 2026-01-23 09:14:40.205462308 +0000 UTC m=+0.041940080 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:14:40 compute-0 podman[211673]: 2026-01-23 09:14:40.205872152 +0000 UTC m=+0.045033585 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.313 182096 DEBUG nova.virt.libvirt.driver [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.665 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159665.6640286, 52fc65a3-ff92-4710-bcde-048b467fdd86 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.665 182096 INFO nova.compute.manager [-] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] VM Stopped (Lifecycle Event)
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.697 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.820 182096 DEBUG nova.compute.manager [None req-b9fb6774-6379-4adb-a091-688b33a4568c - - - - - -] [instance: 52fc65a3-ff92-4710-bcde-048b467fdd86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.834 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.835 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.843 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:14:40 compute-0 nova_compute[182092]: 2026-01-23 09:14:40.843 182096 INFO nova.compute.claims [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:14:41 compute-0 nova_compute[182092]: 2026-01-23 09:14:41.039 182096 DEBUG nova.compute.provider_tree [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:41 compute-0 nova_compute[182092]: 2026-01-23 09:14:41.855 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:42 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 23 09:14:42 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Consumed 11.082s CPU time.
Jan 23 09:14:42 compute-0 systemd-machined[153562]: Machine qemu-8-instance-00000012 terminated.
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.323 182096 INFO nova.virt.libvirt.driver [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance shutdown successfully after 13 seconds.
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.327 182096 INFO nova.virt.libvirt.driver [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance destroyed successfully.
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.329 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.375 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.376 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.422 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.424 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Copying file /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk to 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:14:43 compute-0 nova_compute[182092]: 2026-01-23 09:14:43.424 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.000 182096 DEBUG nova.scheduler.client.report [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.059 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "scp -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk" returned: 0 in 0.634s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.059 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Copying file /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.060 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.config 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.107 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.160 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "996e59bf-2599-4464-b733-b0bbed240ae4" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.161 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "996e59bf-2599-4464-b733-b0bbed240ae4" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.171 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.214 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "996e59bf-2599-4464-b733-b0bbed240ae4" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.214 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.238 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "scp -C -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.config 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.238 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Copying file /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.238 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.info 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.281 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.281 182096 DEBUG nova.network.neutron [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.303 182096 INFO nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.325 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.404 182096 DEBUG oslo_concurrency.processutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] CMD "scp -C -r /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_resize/disk.info 192.168.122.102:/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.462 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.463 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.495 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.502 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.502 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.503 182096 INFO nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Creating image(s)
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.503 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.503 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.504 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.514 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.560 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.561 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.561 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.570 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.582 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Acquiring lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.583 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.583 182096 DEBUG oslo_concurrency.lockutils [None req-8994f31f-2472-46e3-890f-8c9b81290340 39e05d5298044cce8d8baa526aabda4e 6d31b6d5a7b34d789dc67cdecfeaf727 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.614 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.614 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.616 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.617 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.631 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.631 182096 INFO nova.compute.claims [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.638 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.639 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.639 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.659 182096 DEBUG nova.network.neutron [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.659 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.686 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.686 182096 DEBUG nova.virt.disk.api [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Checking if we can resize image /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.687 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.733 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.733 182096 DEBUG nova.virt.disk.api [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Cannot resize image /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.734 182096 DEBUG nova.objects.instance [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c2049e7-3c68-4550-a1c7-541164b292b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.754 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.754 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Ensure instance console log exists: /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.754 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.755 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.755 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.756 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.759 182096 WARNING nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.764 182096 DEBUG nova.virt.libvirt.host [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.764 182096 DEBUG nova.virt.libvirt.host [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.766 182096 DEBUG nova.virt.libvirt.host [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.767 182096 DEBUG nova.virt.libvirt.host [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.768 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.768 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.768 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.768 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.769 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.769 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.769 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.769 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.769 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.770 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.770 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.770 182096 DEBUG nova.virt.hardware [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.772 182096 DEBUG nova.objects.instance [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c2049e7-3c68-4550-a1c7-541164b292b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.785 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <uuid>8c2049e7-3c68-4550-a1c7-541164b292b3</uuid>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <name>instance-00000016</name>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersOnMultiNodesTest-server-241474519-1</nova:name>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:44</nova:creationTime>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:user uuid="bd82cd872bee49129f7a48e632aad32b">tempest-ServersOnMultiNodesTest-1645797145-project-member</nova:user>
Jan 23 09:14:44 compute-0 nova_compute[182092]:         <nova:project uuid="0ae23d2314c44c60aeac024a2d783cb4">tempest-ServersOnMultiNodesTest-1645797145</nova:project>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="serial">8c2049e7-3c68-4550-a1c7-541164b292b3</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="uuid">8c2049e7-3c68-4550-a1c7-541164b292b3</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.config"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/console.log" append="off"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:44 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:44 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.822 182096 DEBUG nova.compute.provider_tree [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.847 182096 DEBUG nova.scheduler.client.report [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.853 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.853 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.853 182096 INFO nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Using config drive
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.872 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.873 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.916 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.916 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.931 182096 INFO nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:14:44 compute-0 nova_compute[182092]: 2026-01-23 09:14:44.951 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.067 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.068 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.068 182096 INFO nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Creating image(s)
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.069 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.069 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.070 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.079 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.126 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.127 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.127 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.137 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.183 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.184 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.216 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.217 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.217 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.263 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.264 182096 DEBUG nova.virt.disk.api [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Checking if we can resize image /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.264 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.308 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.309 182096 DEBUG nova.virt.disk.api [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Cannot resize image /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.310 182096 DEBUG nova.objects.instance [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lazy-loading 'migration_context' on Instance uuid e20a497a-3335-4412-8252-160abc20748b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.339 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.340 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Ensure instance console log exists: /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.340 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.341 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.341 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.473 182096 INFO nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Creating config drive at /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.config
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.476 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzm86n6h0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.589 182096 DEBUG nova.policy [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '500d8382713b469882428068fb752dda', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20809f8e4c7d472e9e4e858c4ae7ebde', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.593 182096 DEBUG oslo_concurrency.processutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzm86n6h0" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:45 compute-0 systemd-machined[153562]: New machine qemu-9-instance-00000016.
Jan 23 09:14:45 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000016.
Jan 23 09:14:45 compute-0 nova_compute[182092]: 2026-01-23 09:14:45.697 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.020 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159686.020308, 8c2049e7-3c68-4550-a1c7-541164b292b3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.021 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] VM Resumed (Lifecycle Event)
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.023 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.023 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.026 182096 INFO nova.virt.libvirt.driver [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance spawned successfully.
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.026 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.062 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.066 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.068 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.068 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.069 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.069 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.069 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.070 182096 DEBUG nova.virt.libvirt.driver [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.093 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.094 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159686.0209339, 8c2049e7-3c68-4550-a1c7-541164b292b3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.094 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] VM Started (Lifecycle Event)
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.120 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.121 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.144 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.157 182096 INFO nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Took 1.66 seconds to spawn the instance on the hypervisor.
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.157 182096 DEBUG nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.209 182096 INFO nova.compute.manager [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Took 6.82 seconds to build instance.
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.230 182096 DEBUG oslo_concurrency.lockutils [None req-8754cab8-b4cf-4dc6-a0e5-91650b685f9b bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.761 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Successfully created port: 712e26fc-1b1b-4107-b48d-e274507951ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:14:46 compute-0 nova_compute[182092]: 2026-01-23 09:14:46.857 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.766 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Successfully updated port: 712e26fc-1b1b-4107-b48d-e274507951ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.778 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.778 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquired lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.778 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.853 182096 DEBUG nova.compute.manager [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-changed-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.853 182096 DEBUG nova.compute.manager [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Refreshing instance network info cache due to event network-changed-712e26fc-1b1b-4107-b48d-e274507951ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.853 182096 DEBUG oslo_concurrency.lockutils [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:47 compute-0 nova_compute[182092]: 2026-01-23 09:14:47.976 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.652 182096 DEBUG nova.network.neutron [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Updating instance_info_cache with network_info: [{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.676 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Releasing lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.676 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Instance network_info: |[{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.677 182096 DEBUG oslo_concurrency.lockutils [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.677 182096 DEBUG nova.network.neutron [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Refreshing network info cache for port 712e26fc-1b1b-4107-b48d-e274507951ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.679 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Start _get_guest_xml network_info=[{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.683 182096 WARNING nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.692 182096 DEBUG nova.virt.libvirt.host [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.693 182096 DEBUG nova.virt.libvirt.host [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.698 182096 DEBUG nova.virt.libvirt.host [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.698 182096 DEBUG nova.virt.libvirt.host [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.699 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.699 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.700 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.700 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.700 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.700 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.701 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.701 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.702 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.702 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.703 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.703 182096 DEBUG nova.virt.hardware [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.705 182096 DEBUG nova.virt.libvirt.vif [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1186247627',display_name='tempest-ServersAdminTestJSON-server-1186247627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1186247627',id=24,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20809f8e4c7d472e9e4e858c4ae7ebde',ramdisk_id='',reservation_id='r-s0yyyf3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1639739915',owner_user_name='tempest-ServersAdminTestJSON-16397
39915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:14:44Z,user_data=None,user_id='500d8382713b469882428068fb752dda',uuid=e20a497a-3335-4412-8252-160abc20748b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.706 182096 DEBUG nova.network.os_vif_util [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converting VIF {"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.707 182096 DEBUG nova.network.os_vif_util [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.707 182096 DEBUG nova.objects.instance [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lazy-loading 'pci_devices' on Instance uuid e20a497a-3335-4412-8252-160abc20748b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.716 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <uuid>e20a497a-3335-4412-8252-160abc20748b</uuid>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <name>instance-00000018</name>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersAdminTestJSON-server-1186247627</nova:name>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:49</nova:creationTime>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:user uuid="500d8382713b469882428068fb752dda">tempest-ServersAdminTestJSON-1639739915-project-member</nova:user>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:project uuid="20809f8e4c7d472e9e4e858c4ae7ebde">tempest-ServersAdminTestJSON-1639739915</nova:project>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         <nova:port uuid="712e26fc-1b1b-4107-b48d-e274507951ce">
Jan 23 09:14:49 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="serial">e20a497a-3335-4412-8252-160abc20748b</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="uuid">e20a497a-3335-4412-8252-160abc20748b</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.config"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:d0:13:02"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <target dev="tap712e26fc-1b"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/console.log" append="off"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:49 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:49 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:49 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:49 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:49 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.720 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Preparing to wait for external event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.721 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.721 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.721 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.722 182096 DEBUG nova.virt.libvirt.vif [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1186247627',display_name='tempest-ServersAdminTestJSON-server-1186247627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1186247627',id=24,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20809f8e4c7d472e9e4e858c4ae7ebde',ramdisk_id='',reservation_id='r-s0yyyf3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1639739915',owner_user_name='tempest-ServersAdminTestJSON-1639739915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:14:44Z,user_data=None,user_id='500d8382713b469882428068fb752dda',uuid=e20a497a-3335-4412-8252-160abc20748b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.722 182096 DEBUG nova.network.os_vif_util [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converting VIF {"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.723 182096 DEBUG nova.network.os_vif_util [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.723 182096 DEBUG os_vif [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.724 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.724 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.724 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.727 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.727 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap712e26fc-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.727 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap712e26fc-1b, col_values=(('external_ids', {'iface-id': '712e26fc-1b1b-4107-b48d-e274507951ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:13:02', 'vm-uuid': 'e20a497a-3335-4412-8252-160abc20748b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:49 compute-0 NetworkManager[54920]: <info>  [1769159689.7300] manager: (tap712e26fc-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.732 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.736 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.737 182096 INFO os_vif [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b')
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.788 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.788 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.788 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] No VIF found with MAC fa:16:3e:d0:13:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:14:49 compute-0 nova_compute[182092]: 2026-01-23 09:14:49.789 182096 INFO nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Using config drive
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.313 182096 INFO nova.compute.manager [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Swapping old allocation on dict_keys(['052a7ae7-9ec7-49ca-a013-73791f9c049a']) held by migration 9f6c614a-c17a-4be9-8fc4-5e5a503f8933 for instance
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.368 182096 DEBUG nova.scheduler.client.report [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Overwriting current allocation {'allocations': {'d68f3af8-1e98-40ae-8ad3-27aefe8f9d3d': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 16}}, 'project_id': '0ce2bc741e474401a3f9c2ef00c19693', 'user_id': '03b66d4c15354adcaae5e4e4cd711e1b', 'consumer_generation': 1} on consumer f7ddc4fd-b0a8-4309-bdb4-fd989a270ede move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.455 182096 INFO nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Creating config drive at /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.config
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.459 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe6x_uayb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.577 182096 DEBUG oslo_concurrency.processutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe6x_uayb" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.6132] manager: (tap712e26fc-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 09:14:50 compute-0 kernel: tap712e26fc-1b: entered promiscuous mode
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:50 compute-0 ovn_controller[94697]: 2026-01-23T09:14:50Z|00055|binding|INFO|Claiming lport 712e26fc-1b1b-4107-b48d-e274507951ce for this chassis.
Jan 23 09:14:50 compute-0 ovn_controller[94697]: 2026-01-23T09:14:50Z|00056|binding|INFO|712e26fc-1b1b-4107-b48d-e274507951ce: Claiming fa:16:3e:d0:13:02 10.100.0.3
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.624 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:13:02 10.100.0.3'], port_security=['fa:16:3e:d0:13:02 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e20a497a-3335-4412-8252-160abc20748b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e82582-8315-4300-9368-7c276c138ed9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20809f8e4c7d472e9e4e858c4ae7ebde', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a369b5a-736d-46db-b1d1-2611ef520ef5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e21d2a14-35ce-4230-a1cf-88a95faee610, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=712e26fc-1b1b-4107-b48d-e274507951ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.625 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 712e26fc-1b1b-4107-b48d-e274507951ce in datapath d1e82582-8315-4300-9368-7c276c138ed9 bound to our chassis
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.626 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1e82582-8315-4300-9368-7c276c138ed9
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.638 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5039f1f2-08ad-4dba-9473-4af2e80d2b40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.638 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1e82582-81 in ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.644 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1e82582-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.644 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b50ae8-93e8-4036-9614-cda62000f0c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.645 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee34d2d7-d6d6-448f-916b-c6b09db32e31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.652 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4b0f14-950e-4f98-baba-e805af58b1ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.654 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.655 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.665 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.665 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.665 182096 DEBUG nova.network.neutron [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:50 compute-0 systemd-udevd[211813]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:14:50 compute-0 systemd-machined[153562]: New machine qemu-10-instance-00000018.
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.680 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd150410-2939-4b95-9f48-d6c9dbf5861e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.6829] device (tap712e26fc-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.6834] device (tap712e26fc-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:14:50 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000018.
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.685 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:50 compute-0 ovn_controller[94697]: 2026-01-23T09:14:50Z|00057|binding|INFO|Setting lport 712e26fc-1b1b-4107-b48d-e274507951ce ovn-installed in OVS
Jan 23 09:14:50 compute-0 ovn_controller[94697]: 2026-01-23T09:14:50Z|00058|binding|INFO|Setting lport 712e26fc-1b1b-4107-b48d-e274507951ce up in Southbound
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.695 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.695 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.695 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.696 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.707 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "8c2049e7-3c68-4550-a1c7-541164b292b3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.708 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.708 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "8c2049e7-3c68-4550-a1c7-541164b292b3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.708 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.708 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.709 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2c70c6-d147-46be-996e-10a81b77cb89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.7145] manager: (tapd1e82582-80): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 09:14:50 compute-0 systemd-udevd[211816]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.716 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8e92a6-aa96-47d8-a2d2-e6181c3e4b5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.723 182096 INFO nova.compute.manager [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Terminating instance
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.730 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "refresh_cache-8c2049e7-3c68-4550-a1c7-541164b292b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.730 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquired lock "refresh_cache-8c2049e7-3c68-4550-a1c7-541164b292b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.730 182096 DEBUG nova.network.neutron [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.748 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[307226f8-f94e-42f4-889a-b3ba46b01472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.750 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[73d8d7ef-b8e7-4817-a542-e1d3a50353ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.7682] device (tapd1e82582-80): carrier: link connected
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.772 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[16889484-7648-4722-ad4e-884e38c70cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.786 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b7773c0a-3d79-45f9-bb52-46f5142b4994]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e82582-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:2d:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319625, 'reachable_time': 19005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211836, 'error': None, 'target': 'ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.798 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a69e2d5b-3cea-4f94-9a47-ee657f53e43a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:2d43'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 319625, 'tstamp': 319625}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211837, 'error': None, 'target': 'ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.811 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.809 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1983b8-428c-4ba7-ad29-f5985e17bd66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e82582-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:2d:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319625, 'reachable_time': 19005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211838, 'error': None, 'target': 'ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.839 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f429d53-4e8c-4595-875c-4d9bd4969b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.854 182096 DEBUG nova.network.neutron [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.874 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.875 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.885 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2b09ad9a-dbfa-4f06-a925-a2bc7ead34c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.886 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e82582-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.887 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.887 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1e82582-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:50 compute-0 kernel: tapd1e82582-80: entered promiscuous mode
Jan 23 09:14:50 compute-0 NetworkManager[54920]: <info>  [1769159690.8901] manager: (tapd1e82582-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.894 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.894 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1e82582-80, col_values=(('external_ids', {'iface-id': 'cfa17eb5-58cc-4063-95c2-58b8e9f89675'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:14:50 compute-0 ovn_controller[94697]: 2026-01-23T09:14:50Z|00059|binding|INFO|Releasing lport cfa17eb5-58cc-4063-95c2-58b8e9f89675 from this chassis (sb_readonly=0)
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.908 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.909 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1e82582-8315-4300-9368-7c276c138ed9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1e82582-8315-4300-9368-7c276c138ed9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.909 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5424542f-6b60-4be1-897d-1b79689d1b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.910 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-d1e82582-8315-4300-9368-7c276c138ed9
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/d1e82582-8315-4300-9368-7c276c138ed9.pid.haproxy
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID d1e82582-8315-4300-9368-7c276c138ed9
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:14:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:14:50.911 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9', 'env', 'PROCESS_TAG=haproxy-d1e82582-8315-4300-9368-7c276c138ed9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1e82582-8315-4300-9368-7c276c138ed9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.946 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.947 182096 DEBUG nova.network.neutron [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.952 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000012, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.955 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.974 182096 DEBUG nova.compute.manager [req-3290232e-c98d-43a2-bf66-a7ae1264b8cd req-8a9531dd-6d51-4ce0-ae29-4036685b1d44 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.975 182096 DEBUG oslo_concurrency.lockutils [req-3290232e-c98d-43a2-bf66-a7ae1264b8cd req-8a9531dd-6d51-4ce0-ae29-4036685b1d44 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.975 182096 DEBUG oslo_concurrency.lockutils [req-3290232e-c98d-43a2-bf66-a7ae1264b8cd req-8a9531dd-6d51-4ce0-ae29-4036685b1d44 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.975 182096 DEBUG oslo_concurrency.lockutils [req-3290232e-c98d-43a2-bf66-a7ae1264b8cd req-8a9531dd-6d51-4ce0-ae29-4036685b1d44 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:50 compute-0 nova_compute[182092]: 2026-01-23 09:14:50.975 182096 DEBUG nova.compute.manager [req-3290232e-c98d-43a2-bf66-a7ae1264b8cd req-8a9531dd-6d51-4ce0-ae29-4036685b1d44 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Processing event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.026 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.026 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.094 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.102 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.161 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.161 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.231 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.235 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 podman[211886]: 2026-01-23 09:14:51.242881755 +0000 UTC m=+0.047047742 container create 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:14:51 compute-0 systemd[1]: Started libpod-conmon-29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6.scope.
Jan 23 09:14:51 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.291 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.292 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/164c1baee4dee95ecab06372cd5b6dc3b3b47eba41d7f3ff579a9244501ae2af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:14:51 compute-0 podman[211886]: 2026-01-23 09:14:51.30383868 +0000 UTC m=+0.108004688 container init 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 09:14:51 compute-0 podman[211886]: 2026-01-23 09:14:51.30881726 +0000 UTC m=+0.112983247 container start 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 09:14:51 compute-0 podman[211886]: 2026-01-23 09:14:51.219006947 +0000 UTC m=+0.023172954 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:14:51 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [NOTICE]   (211908) : New worker (211910) forked
Jan 23 09:14:51 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [NOTICE]   (211908) : Loading success.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.365 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.569 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.569 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159691.5682309, e20a497a-3335-4412-8252-160abc20748b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.570 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] VM Started (Lifecycle Event)
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.572 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.575 182096 INFO nova.virt.libvirt.driver [-] [instance: e20a497a-3335-4412-8252-160abc20748b] Instance spawned successfully.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.575 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.596 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.599 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.600 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.600 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.600 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.600 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.601 182096 DEBUG nova.virt.libvirt.driver [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.603 182096 DEBUG nova.network.neutron [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.606 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.632 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.632 182096 DEBUG nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.634 182096 DEBUG nova.network.neutron [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Updated VIF entry in instance network info cache for port 712e26fc-1b1b-4107-b48d-e274507951ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.634 182096 DEBUG nova.network.neutron [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Updating instance_info_cache with network_info: [{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.638 182096 DEBUG nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.641 182096 WARNING nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.646 182096 DEBUG nova.virt.libvirt.host [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.646 182096 DEBUG nova.virt.libvirt.host [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.648 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.648 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5339MB free_disk=73.28674697875977GB free_vcpus=0 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.649 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.649 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.654 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.654 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159691.5683622, e20a497a-3335-4412-8252-160abc20748b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.654 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] VM Paused (Lifecycle Event)
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.655 182096 DEBUG nova.virt.libvirt.host [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.655 182096 DEBUG nova.virt.libvirt.host [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.656 182096 DEBUG nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.656 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.657 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.658 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.658 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.658 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.658 182096 DEBUG nova.virt.hardware [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.658 182096 DEBUG nova.objects.instance [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f7ddc4fd-b0a8-4309-bdb4-fd989a270ede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.661 182096 DEBUG oslo_concurrency.lockutils [req-bafc9883-95c6-425d-a234-25b3cac46fd9 req-94f0c545-b49b-4d7b-b401-4b279a42d0fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.681 182096 DEBUG oslo_concurrency.processutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.694 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.701 182096 INFO nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Took 6.63 seconds to spawn the instance on the hypervisor.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.701 182096 DEBUG nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.706 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159691.57221, e20a497a-3335-4412-8252-160abc20748b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.706 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] VM Resumed (Lifecycle Event)
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.730 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.734 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 8c2049e7-3c68-4550-a1c7-541164b292b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance f7ddc4fd-b0a8-4309-bdb4-fd989a270ede actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance e20a497a-3335-4412-8252-160abc20748b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.739 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=1216MB phys_disk=79GB used_disk=5GB total_vcpus=4 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.742 182096 DEBUG nova.network.neutron [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.747 182096 DEBUG oslo_concurrency.processutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.747 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.747 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.748 182096 DEBUG oslo_concurrency.lockutils [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.749 182096 DEBUG nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <uuid>f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</uuid>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <name>instance-00000012</name>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:name>tempest-MigrationsAdminTest-server-1833245038</nova:name>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:14:51</nova:creationTime>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:user uuid="03b66d4c15354adcaae5e4e4cd711e1b">tempest-MigrationsAdminTest-1469473097-project-member</nova:user>
Jan 23 09:14:51 compute-0 nova_compute[182092]:         <nova:project uuid="0ce2bc741e474401a3f9c2ef00c19693">tempest-MigrationsAdminTest-1469473097</nova:project>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <system>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="serial">f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="uuid">f7ddc4fd-b0a8-4309-bdb4-fd989a270ede</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </system>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <os>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </os>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <features>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </features>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/disk.config"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede/console.log" append="off"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <video>
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </video>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:14:51 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:14:51 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:14:51 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:14:51 compute-0 nova_compute[182092]: </domain>
Jan 23 09:14:51 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.757 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.760 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Releasing lock "refresh_cache-8c2049e7-3c68-4550-a1c7-541164b292b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.761 182096 DEBUG nova.compute.manager [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.781 182096 INFO nova.compute.manager [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Took 7.19 seconds to build instance.
Jan 23 09:14:51 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 23 09:14:51 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000016.scope: Consumed 6.054s CPU time.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.802 182096 DEBUG oslo_concurrency.lockutils [None req-0abf34be-8c13-4a31-b42e-b92776a8bd86 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:51 compute-0 systemd-machined[153562]: Machine qemu-9-instance-00000016 terminated.
Jan 23 09:14:51 compute-0 systemd-machined[153562]: New machine qemu-11-instance-00000012.
Jan 23 09:14:51 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000012.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.856 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.859 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.869 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.882 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.883 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.997 182096 INFO nova.virt.libvirt.driver [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance destroyed successfully.
Jan 23 09:14:51 compute-0 nova_compute[182092]: 2026-01-23 09:14:51.997 182096 DEBUG nova.objects.instance [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lazy-loading 'resources' on Instance uuid 8c2049e7-3c68-4550-a1c7-541164b292b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.016 182096 INFO nova.virt.libvirt.driver [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Deleting instance files /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3_del
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.016 182096 INFO nova.virt.libvirt.driver [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Deletion of /var/lib/nova/instances/8c2049e7-3c68-4550-a1c7-541164b292b3_del complete
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.079 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for f7ddc4fd-b0a8-4309-bdb4-fd989a270ede due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.080 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159692.0795352, f7ddc4fd-b0a8-4309-bdb4-fd989a270ede => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.080 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] VM Resumed (Lifecycle Event)
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.081 182096 DEBUG nova.compute.manager [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.084 182096 INFO nova.virt.libvirt.driver [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance running successfully.
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.084 182096 DEBUG nova.virt.libvirt.driver [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.123 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.125 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.160 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.161 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159692.0812042, f7ddc4fd-b0a8-4309-bdb4-fd989a270ede => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.161 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] VM Started (Lifecycle Event)
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.183 182096 INFO nova.compute.manager [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.183 182096 DEBUG oslo.service.loopingcall [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.184 182096 DEBUG nova.compute.manager [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.184 182096 DEBUG nova.network.neutron [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.187 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.189 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.212 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.252 182096 INFO nova.compute.manager [None req-15f9fc45-a1b8-4f98-8a6c-de37a88dbfd3 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Updating instance to original state: 'active'
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.431 182096 DEBUG nova.network.neutron [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.449 182096 DEBUG nova.network.neutron [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.462 182096 INFO nova.compute.manager [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Took 0.28 seconds to deallocate network for instance.
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.524 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.525 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.651 182096 DEBUG nova.compute.provider_tree [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.666 182096 DEBUG nova.scheduler.client.report [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.684 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.706 182096 INFO nova.scheduler.client.report [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Deleted allocations for instance 8c2049e7-3c68-4550-a1c7-541164b292b3
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.808 182096 DEBUG oslo_concurrency.lockutils [None req-6c3e18ea-6e81-4a45-bd9d-7ea944336d57 bd82cd872bee49129f7a48e632aad32b 0ae23d2314c44c60aeac024a2d783cb4 - - default default] Lock "8c2049e7-3c68-4550-a1c7-541164b292b3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.873 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.874 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:52 compute-0 nova_compute[182092]: 2026-01-23 09:14:52.874 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.168 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.168 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.168 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.193 182096 DEBUG oslo_concurrency.lockutils [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] Acquiring lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.193 182096 DEBUG oslo_concurrency.lockutils [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] Acquired lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.194 182096 DEBUG nova.network.neutron [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:53 compute-0 podman[211963]: 2026-01-23 09:14:53.228246238 +0000 UTC m=+0.067050087 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 23 09:14:53 compute-0 podman[211964]: 2026-01-23 09:14:53.229560488 +0000 UTC m=+0.068093596 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.287 182096 DEBUG nova.compute.manager [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.288 182096 DEBUG oslo_concurrency.lockutils [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.288 182096 DEBUG oslo_concurrency.lockutils [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.288 182096 DEBUG oslo_concurrency.lockutils [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.288 182096 DEBUG nova.compute.manager [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] No waiting events found dispatching network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.288 182096 WARNING nova.compute.manager [req-b669cd2b-b655-4c5b-9f35-c7490e984a2d req-3c2cb548-59b5-4fc3-ac67-dbba2e376360 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received unexpected event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce for instance with vm_state active and task_state None.
Jan 23 09:14:53 compute-0 nova_compute[182092]: 2026-01-23 09:14:53.590 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.583 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.597 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.597 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.598 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.598 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.598 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.598 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.598 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.915 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.916 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.916 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.916 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.916 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.924 182096 INFO nova.compute.manager [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Terminating instance
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.929 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.930 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:14:54 compute-0 nova_compute[182092]: 2026-01-23 09:14:54.930 182096 DEBUG nova.network.neutron [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.121 182096 DEBUG nova.network.neutron [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.350 182096 DEBUG nova.network.neutron [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Updating instance_info_cache with network_info: [{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.383 182096 DEBUG oslo_concurrency.lockutils [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] Releasing lock "refresh_cache-e20a497a-3335-4412-8252-160abc20748b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.383 182096 DEBUG nova.compute.manager [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.383 182096 DEBUG nova.compute.manager [None req-127a7737-a10e-458e-9486-6234d963b79b 14158726c1ce45feb31a33c2e538f9cb da0c252a9d894995820785631977e131 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] network_info to inject: |[{"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.577 182096 DEBUG nova.network.neutron [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.605 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.606 182096 DEBUG nova.compute.manager [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:14:55 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 23 09:14:55 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Consumed 3.756s CPU time.
Jan 23 09:14:55 compute-0 systemd-machined[153562]: Machine qemu-11-instance-00000012 terminated.
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.834 182096 INFO nova.virt.libvirt.driver [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance destroyed successfully.
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.834 182096 DEBUG nova.objects.instance [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid f7ddc4fd-b0a8-4309-bdb4-fd989a270ede obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.846 182096 INFO nova.virt.libvirt.driver [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Deleting instance files /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_del
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.851 182096 INFO nova.virt.libvirt.driver [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Deletion of /var/lib/nova/instances/f7ddc4fd-b0a8-4309-bdb4-fd989a270ede_del complete
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.933 182096 INFO nova.compute.manager [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.933 182096 DEBUG oslo.service.loopingcall [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.933 182096 DEBUG nova.compute.manager [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:14:55 compute-0 nova_compute[182092]: 2026-01-23 09:14:55.933 182096 DEBUG nova.network.neutron [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.147 182096 DEBUG nova.network.neutron [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.158 182096 DEBUG nova.network.neutron [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.173 182096 INFO nova.compute.manager [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Took 0.24 seconds to deallocate network for instance.
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.233 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.233 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.338 182096 DEBUG nova.compute.provider_tree [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.357 182096 DEBUG nova.scheduler.client.report [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.374 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.407 182096 INFO nova.scheduler.client.report [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Deleted allocations for instance f7ddc4fd-b0a8-4309-bdb4-fd989a270ede
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.507 182096 DEBUG oslo_concurrency.lockutils [None req-47c4747c-9270-42b5-86e9-7df37019ffe8 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "f7ddc4fd-b0a8-4309-bdb4-fd989a270ede" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:14:56 compute-0 nova_compute[182092]: 2026-01-23 09:14:56.861 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:14:58 compute-0 podman[212021]: 2026-01-23 09:14:58.20980847 +0000 UTC m=+0.048118323 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:14:59 compute-0 nova_compute[182092]: 2026-01-23 09:14:59.730 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:01 compute-0 ovn_controller[94697]: 2026-01-23T09:15:01Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:13:02 10.100.0.3
Jan 23 09:15:01 compute-0 ovn_controller[94697]: 2026-01-23T09:15:01Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:13:02 10.100.0.3
Jan 23 09:15:01 compute-0 nova_compute[182092]: 2026-01-23 09:15:01.862 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.877 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.878 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.878 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.878 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.878 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.885 182096 INFO nova.compute.manager [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Terminating instance
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.891 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.891 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:15:02 compute-0 nova_compute[182092]: 2026-01-23 09:15:02.891 182096 DEBUG nova.network.neutron [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:15:03 compute-0 nova_compute[182092]: 2026-01-23 09:15:03.134 182096 DEBUG nova.network.neutron [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:03 compute-0 nova_compute[182092]: 2026-01-23 09:15:03.762 182096 DEBUG nova.network.neutron [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:03 compute-0 nova_compute[182092]: 2026-01-23 09:15:03.785 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:15:03 compute-0 nova_compute[182092]: 2026-01-23 09:15:03.786 182096 DEBUG nova.compute.manager [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:15:03 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 23 09:15:03 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 14.055s CPU time.
Jan 23 09:15:03 compute-0 systemd-machined[153562]: Machine qemu-5-instance-00000009 terminated.
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.011 182096 INFO nova.virt.libvirt.driver [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance destroyed successfully.
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.011 182096 DEBUG nova.objects.instance [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.113 182096 INFO nova.virt.libvirt.driver [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Deleting instance files /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58_del
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.116 182096 INFO nova.virt.libvirt.driver [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Deletion of /var/lib/nova/instances/0e8dc87a-78f6-4d2d-b19b-6c51917bfd58_del complete
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.235 182096 INFO nova.compute.manager [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.236 182096 DEBUG oslo.service.loopingcall [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.236 182096 DEBUG nova.compute.manager [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.236 182096 DEBUG nova.network.neutron [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.460 182096 DEBUG nova.network.neutron [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.472 182096 DEBUG nova.network.neutron [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.489 182096 INFO nova.compute.manager [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Took 0.25 seconds to deallocate network for instance.
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.550 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.551 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.644 182096 DEBUG nova.compute.provider_tree [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.653 182096 DEBUG nova.scheduler.client.report [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.681 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.721 182096 INFO nova.scheduler.client.report [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Deleted allocations for instance 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.731 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:04 compute-0 nova_compute[182092]: 2026-01-23 09:15:04.817 182096 DEBUG oslo_concurrency.lockutils [None req-b8f8a892-b707-4ed1-8755-28a002e48bde 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "0e8dc87a-78f6-4d2d-b19b-6c51917bfd58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:05 compute-0 podman[212059]: 2026-01-23 09:15:05.226387809 +0000 UTC m=+0.064694274 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.852 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.852 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.853 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.853 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.853 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.861 182096 INFO nova.compute.manager [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Terminating instance
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.866 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.866 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquired lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:15:05 compute-0 nova_compute[182092]: 2026-01-23 09:15:05.867 182096 DEBUG nova.network.neutron [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.108 182096 DEBUG nova.network.neutron [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.592 182096 DEBUG nova.network.neutron [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.617 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Releasing lock "refresh_cache-7bf6771c-28de-4cd3-a95a-d8a3e8b25928" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.618 182096 DEBUG nova.compute.manager [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:15:06 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 23 09:15:06 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 12.912s CPU time.
Jan 23 09:15:06 compute-0 systemd-machined[153562]: Machine qemu-3-instance-00000005 terminated.
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.852 182096 INFO nova.virt.libvirt.driver [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance destroyed successfully.
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.852 182096 DEBUG nova.objects.instance [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lazy-loading 'resources' on Instance uuid 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.863 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.866 182096 INFO nova.virt.libvirt.driver [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Deleting instance files /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928_del
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.870 182096 INFO nova.virt.libvirt.driver [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Deletion of /var/lib/nova/instances/7bf6771c-28de-4cd3-a95a-d8a3e8b25928_del complete
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.922 182096 INFO nova.compute.manager [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.922 182096 DEBUG oslo.service.loopingcall [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.922 182096 DEBUG nova.compute.manager [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.923 182096 DEBUG nova.network.neutron [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.996 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159691.9952357, 8c2049e7-3c68-4550-a1c7-541164b292b3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:06 compute-0 nova_compute[182092]: 2026-01-23 09:15:06.996 182096 INFO nova.compute.manager [-] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] VM Stopped (Lifecycle Event)
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.017 182096 DEBUG nova.compute.manager [None req-2cdb7c36-8a69-4ad1-b949-835e5bbc90c4 - - - - - -] [instance: 8c2049e7-3c68-4550-a1c7-541164b292b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.184 182096 DEBUG nova.network.neutron [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.200 182096 DEBUG nova.network.neutron [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.212 182096 INFO nova.compute.manager [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Took 0.29 seconds to deallocate network for instance.
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.289 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.289 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.345 182096 DEBUG nova.compute.provider_tree [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.360 182096 DEBUG nova.scheduler.client.report [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.382 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.414 182096 INFO nova.scheduler.client.report [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Deleted allocations for instance 7bf6771c-28de-4cd3-a95a-d8a3e8b25928
Jan 23 09:15:07 compute-0 nova_compute[182092]: 2026-01-23 09:15:07.493 182096 DEBUG oslo_concurrency.lockutils [None req-7fc990a0-c64e-4287-9012-3d730c245316 03b66d4c15354adcaae5e4e4cd711e1b 0ce2bc741e474401a3f9c2ef00c19693 - - default default] Lock "7bf6771c-28de-4cd3-a95a-d8a3e8b25928" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:09 compute-0 nova_compute[182092]: 2026-01-23 09:15:09.733 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:10 compute-0 nova_compute[182092]: 2026-01-23 09:15:10.834 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159695.8324254, f7ddc4fd-b0a8-4309-bdb4-fd989a270ede => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:10 compute-0 nova_compute[182092]: 2026-01-23 09:15:10.834 182096 INFO nova.compute.manager [-] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] VM Stopped (Lifecycle Event)
Jan 23 09:15:10 compute-0 nova_compute[182092]: 2026-01-23 09:15:10.852 182096 DEBUG nova.compute.manager [None req-12bad9dd-4741-4af9-a97a-6300f847ad30 - - - - - -] [instance: f7ddc4fd-b0a8-4309-bdb4-fd989a270ede] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:11 compute-0 podman[212092]: 2026-01-23 09:15:11.211632819 +0000 UTC m=+0.041812039 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:15:11 compute-0 podman[212091]: 2026-01-23 09:15:11.212800531 +0000 UTC m=+0.045407149 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 23 09:15:11 compute-0 nova_compute[182092]: 2026-01-23 09:15:11.864 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:14 compute-0 nova_compute[182092]: 2026-01-23 09:15:14.735 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:16 compute-0 nova_compute[182092]: 2026-01-23 09:15:16.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:17.582 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:15:17 compute-0 nova_compute[182092]: 2026-01-23 09:15:17.582 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:17.583 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:15:19 compute-0 nova_compute[182092]: 2026-01-23 09:15:19.011 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159704.0100539, 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:19 compute-0 nova_compute[182092]: 2026-01-23 09:15:19.012 182096 INFO nova.compute.manager [-] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] VM Stopped (Lifecycle Event)
Jan 23 09:15:19 compute-0 nova_compute[182092]: 2026-01-23 09:15:19.040 182096 DEBUG nova.compute.manager [None req-d4dcb984-fd9c-402e-a5e6-9fc4da5cc271 - - - - - -] [instance: 0e8dc87a-78f6-4d2d-b19b-6c51917bfd58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:19 compute-0 nova_compute[182092]: 2026-01-23 09:15:19.737 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:21 compute-0 nova_compute[182092]: 2026-01-23 09:15:21.852 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159706.850868, 7bf6771c-28de-4cd3-a95a-d8a3e8b25928 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:21 compute-0 nova_compute[182092]: 2026-01-23 09:15:21.854 182096 INFO nova.compute.manager [-] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] VM Stopped (Lifecycle Event)
Jan 23 09:15:21 compute-0 nova_compute[182092]: 2026-01-23 09:15:21.869 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:21 compute-0 nova_compute[182092]: 2026-01-23 09:15:21.874 182096 DEBUG nova.compute.manager [None req-ceef8533-1176-4918-b76d-8c44183596c9 - - - - - -] [instance: 7bf6771c-28de-4cd3-a95a-d8a3e8b25928] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:24 compute-0 podman[212129]: 2026-01-23 09:15:24.209579532 +0000 UTC m=+0.046549399 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:15:24 compute-0 podman[212130]: 2026-01-23 09:15:24.209592707 +0000 UTC m=+0.041668390 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:15:24 compute-0 nova_compute[182092]: 2026-01-23 09:15:24.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:26.585 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:15:26 compute-0 nova_compute[182092]: 2026-01-23 09:15:26.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.653 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.653 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.654 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.654 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.654 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.660 182096 INFO nova.compute.manager [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Terminating instance
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.665 182096 DEBUG nova.compute.manager [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:15:28 compute-0 kernel: tap712e26fc-1b (unregistering): left promiscuous mode
Jan 23 09:15:28 compute-0 NetworkManager[54920]: <info>  [1769159728.6834] device (tap712e26fc-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:15:28 compute-0 ovn_controller[94697]: 2026-01-23T09:15:28Z|00060|binding|INFO|Releasing lport 712e26fc-1b1b-4107-b48d-e274507951ce from this chassis (sb_readonly=0)
Jan 23 09:15:28 compute-0 ovn_controller[94697]: 2026-01-23T09:15:28Z|00061|binding|INFO|Setting lport 712e26fc-1b1b-4107-b48d-e274507951ce down in Southbound
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.689 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 ovn_controller[94697]: 2026-01-23T09:15:28Z|00062|binding|INFO|Removing iface tap712e26fc-1b ovn-installed in OVS
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.695 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:13:02 10.100.0.3'], port_security=['fa:16:3e:d0:13:02 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e20a497a-3335-4412-8252-160abc20748b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e82582-8315-4300-9368-7c276c138ed9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20809f8e4c7d472e9e4e858c4ae7ebde', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a369b5a-736d-46db-b1d1-2611ef520ef5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e21d2a14-35ce-4230-a1cf-88a95faee610, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=712e26fc-1b1b-4107-b48d-e274507951ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.696 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 712e26fc-1b1b-4107-b48d-e274507951ce in datapath d1e82582-8315-4300-9368-7c276c138ed9 unbound from our chassis
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.697 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1e82582-8315-4300-9368-7c276c138ed9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.698 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a587d3-8e13-458f-9c49-2865ef885f94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.699 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9 namespace which is not needed anymore
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.713 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 23 09:15:28 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000018.scope: Consumed 11.471s CPU time.
Jan 23 09:15:28 compute-0 systemd-machined[153562]: Machine qemu-10-instance-00000018 terminated.
Jan 23 09:15:28 compute-0 podman[212166]: 2026-01-23 09:15:28.759702055 +0000 UTC m=+0.058588704 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 23 09:15:28 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [NOTICE]   (211908) : haproxy version is 2.8.14-c23fe91
Jan 23 09:15:28 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [NOTICE]   (211908) : path to executable is /usr/sbin/haproxy
Jan 23 09:15:28 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [WARNING]  (211908) : Exiting Master process...
Jan 23 09:15:28 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [ALERT]    (211908) : Current worker (211910) exited with code 143 (Terminated)
Jan 23 09:15:28 compute-0 neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9[211901]: [WARNING]  (211908) : All workers exited. Exiting... (0)
Jan 23 09:15:28 compute-0 systemd[1]: libpod-29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6.scope: Deactivated successfully.
Jan 23 09:15:28 compute-0 conmon[211901]: conmon 29603bfd015125e34c9f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6.scope/container/memory.events
Jan 23 09:15:28 compute-0 podman[212205]: 2026-01-23 09:15:28.814108187 +0000 UTC m=+0.036139568 container died 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:15:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6-userdata-shm.mount: Deactivated successfully.
Jan 23 09:15:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-164c1baee4dee95ecab06372cd5b6dc3b3b47eba41d7f3ff579a9244501ae2af-merged.mount: Deactivated successfully.
Jan 23 09:15:28 compute-0 podman[212205]: 2026-01-23 09:15:28.838838518 +0000 UTC m=+0.060869899 container cleanup 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:15:28 compute-0 systemd[1]: libpod-conmon-29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6.scope: Deactivated successfully.
Jan 23 09:15:28 compute-0 podman[212229]: 2026-01-23 09:15:28.89085483 +0000 UTC m=+0.033943815 container remove 29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.897 182096 DEBUG nova.compute.manager [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-unplugged-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.897 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a46163b9-4951-46a6-afa5-537bb1d011e3]: (4, ('Fri Jan 23 09:15:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9 (29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6)\n29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6\nFri Jan 23 09:15:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9 (29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6)\n29603bfd015125e34c9f9411977a0ddbd9b0dba79b1df8b180cda358b49f7cb6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.897 182096 DEBUG oslo_concurrency.lockutils [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.898 182096 DEBUG oslo_concurrency.lockutils [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.898 182096 DEBUG oslo_concurrency.lockutils [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.898 182096 DEBUG nova.compute.manager [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] No waiting events found dispatching network-vif-unplugged-712e26fc-1b1b-4107-b48d-e274507951ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.898 182096 DEBUG nova.compute.manager [req-34dfa1bd-ba73-4ba4-b50e-04cd608b0211 req-28e744f2-bc6e-4eb4-8cf7-436db0577651 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-unplugged-712e26fc-1b1b-4107-b48d-e274507951ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.899 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cd009aee-21bd-4836-9c25-2920088946ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.900 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e82582-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 kernel: tapd1e82582-80: left promiscuous mode
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.918 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.920 182096 INFO nova.virt.libvirt.driver [-] [instance: e20a497a-3335-4412-8252-160abc20748b] Instance destroyed successfully.
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.920 182096 DEBUG nova.objects.instance [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lazy-loading 'resources' on Instance uuid e20a497a-3335-4412-8252-160abc20748b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.921 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2781e442-b862-4deb-89df-828282827630]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.930 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f12ba3d6-25cc-424b-839b-da6324cf8f90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.931 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[df85d529-be36-46de-b0db-ef33053da63b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.933 182096 DEBUG nova.virt.libvirt.vif [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:14:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1186247627',display_name='tempest-ServersAdminTestJSON-server-1186247627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1186247627',id=24,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:14:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20809f8e4c7d472e9e4e858c4ae7ebde',ramdisk_id='',reservation_id='r-s0yyyf3o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1639739915',owner_user_name='tempest-ServersAdminTestJSON-1639739915-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:14:51Z,user_data=None,user_id='500d8382713b469882428068fb752dda',uuid=e20a497a-3335-4412-8252-160abc20748b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.933 182096 DEBUG nova.network.os_vif_util [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converting VIF {"id": "712e26fc-1b1b-4107-b48d-e274507951ce", "address": "fa:16:3e:d0:13:02", "network": {"id": "d1e82582-8315-4300-9368-7c276c138ed9", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1879868738-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20809f8e4c7d472e9e4e858c4ae7ebde", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap712e26fc-1b", "ovs_interfaceid": "712e26fc-1b1b-4107-b48d-e274507951ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.933 182096 DEBUG nova.network.os_vif_util [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.934 182096 DEBUG os_vif [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.935 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.935 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712e26fc-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.936 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.938 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.942 182096 INFO os_vif [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:13:02,bridge_name='br-int',has_traffic_filtering=True,id=712e26fc-1b1b-4107-b48d-e274507951ce,network=Network(d1e82582-8315-4300-9368-7c276c138ed9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap712e26fc-1b')
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.942 182096 INFO nova.virt.libvirt.driver [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Deleting instance files /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b_del
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.943 182096 INFO nova.virt.libvirt.driver [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Deletion of /var/lib/nova/instances/e20a497a-3335-4412-8252-160abc20748b_del complete
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.945 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc365a7-a41c-4b81-be4c-07f995e336c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319619, 'reachable_time': 32491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212261, 'error': None, 'target': 'ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.947 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1e82582-8315-4300-9368-7c276c138ed9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:15:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:28.947 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[3d70fdcd-ea03-44f1-a72e-9c4cf81037fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:15:28 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1e82582\x2d8315\x2d4300\x2d9368\x2d7c276c138ed9.mount: Deactivated successfully.
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.997 182096 INFO nova.compute.manager [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.997 182096 DEBUG oslo.service.loopingcall [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.998 182096 DEBUG nova.compute.manager [-] [instance: e20a497a-3335-4412-8252-160abc20748b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:15:28 compute-0 nova_compute[182092]: 2026-01-23 09:15:28.998 182096 DEBUG nova.network.neutron [-] [instance: e20a497a-3335-4412-8252-160abc20748b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.857 182096 DEBUG nova.network.neutron [-] [instance: e20a497a-3335-4412-8252-160abc20748b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.869 182096 INFO nova.compute.manager [-] [instance: e20a497a-3335-4412-8252-160abc20748b] Took 0.87 seconds to deallocate network for instance.
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.914 182096 DEBUG nova.compute.manager [req-343bc5cd-ab62-4137-8c63-d0d8a348f2b8 req-65f99215-387e-4c79-bc8d-8277eb88831e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-deleted-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.922 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.922 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.986 182096 DEBUG nova.compute.provider_tree [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:29 compute-0 nova_compute[182092]: 2026-01-23 09:15:29.996 182096 DEBUG nova.scheduler.client.report [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.010 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.043 182096 INFO nova.scheduler.client.report [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Deleted allocations for instance e20a497a-3335-4412-8252-160abc20748b
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.113 182096 DEBUG oslo_concurrency.lockutils [None req-5be22451-4e29-48a4-9b2c-828b2c9f559b 500d8382713b469882428068fb752dda 20809f8e4c7d472e9e4e858c4ae7ebde - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.375 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "733473e1-d4b4-4076-8736-00a5139d8996" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.376 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.392 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.462 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.462 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.467 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.468 182096 INFO nova.compute.claims [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.556 182096 DEBUG nova.compute.provider_tree [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.574 182096 DEBUG nova.scheduler.client.report [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.777 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.777 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.814 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.825 182096 INFO nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.837 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.918 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.919 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.919 182096 INFO nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Creating image(s)
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.920 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.920 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.921 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.931 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.977 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.978 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.979 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:30 compute-0 nova_compute[182092]: 2026-01-23 09:15:30.988 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.003 182096 DEBUG nova.compute.manager [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.004 182096 DEBUG oslo_concurrency.lockutils [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e20a497a-3335-4412-8252-160abc20748b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.004 182096 DEBUG oslo_concurrency.lockutils [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.004 182096 DEBUG oslo_concurrency.lockutils [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e20a497a-3335-4412-8252-160abc20748b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.004 182096 DEBUG nova.compute.manager [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] No waiting events found dispatching network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.005 182096 WARNING nova.compute.manager [req-19fc0b93-f70f-4b71-95d7-6e0c304211f9 req-01e8b1b4-bec6-4eca-9c82-4912b23b84f7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e20a497a-3335-4412-8252-160abc20748b] Received unexpected event network-vif-plugged-712e26fc-1b1b-4107-b48d-e274507951ce for instance with vm_state deleted and task_state None.
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.033 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.034 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.069 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.069 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.070 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.132 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.133 182096 DEBUG nova.virt.disk.api [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Checking if we can resize image /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.133 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.192 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.193 182096 DEBUG nova.virt.disk.api [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Cannot resize image /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.193 182096 DEBUG nova.objects.instance [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lazy-loading 'migration_context' on Instance uuid 733473e1-d4b4-4076-8736-00a5139d8996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.205 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.205 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Ensure instance console log exists: /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.206 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.206 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.206 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.208 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.211 182096 WARNING nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.215 182096 DEBUG nova.virt.libvirt.host [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.216 182096 DEBUG nova.virt.libvirt.host [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.218 182096 DEBUG nova.virt.libvirt.host [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.219 182096 DEBUG nova.virt.libvirt.host [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.220 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.220 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.221 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.221 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.221 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.221 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.221 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.222 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.222 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.222 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.222 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.222 182096 DEBUG nova.virt.hardware [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.226 182096 DEBUG nova.objects.instance [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lazy-loading 'pci_devices' on Instance uuid 733473e1-d4b4-4076-8736-00a5139d8996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.236 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <uuid>733473e1-d4b4-4076-8736-00a5139d8996</uuid>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <name>instance-0000001c</name>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiagnosticsV248Test-server-391121388</nova:name>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:15:31</nova:creationTime>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:user uuid="e2fcc88735d5430ca53b5d3ce0a141e3">tempest-ServerDiagnosticsV248Test-703994952-project-member</nova:user>
Jan 23 09:15:31 compute-0 nova_compute[182092]:         <nova:project uuid="81428a503adb47a2a1a371c1a274572a">tempest-ServerDiagnosticsV248Test-703994952</nova:project>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <system>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="serial">733473e1-d4b4-4076-8736-00a5139d8996</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="uuid">733473e1-d4b4-4076-8736-00a5139d8996</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </system>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <os>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </os>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <features>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </features>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.config"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/console.log" append="off"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <video>
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </video>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:15:31 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:15:31 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:15:31 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:15:31 compute-0 nova_compute[182092]: </domain>
Jan 23 09:15:31 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.270 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.270 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.270 182096 INFO nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Using config drive
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.637 182096 INFO nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Creating config drive at /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.config
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.641 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxpjegdru execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.763 182096 DEBUG oslo_concurrency.processutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxpjegdru" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:31 compute-0 systemd-machined[153562]: New machine qemu-12-instance-0000001c.
Jan 23 09:15:31 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000001c.
Jan 23 09:15:31 compute-0 nova_compute[182092]: 2026-01-23 09:15:31.872 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.380 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159732.3801186, 733473e1-d4b4-4076-8736-00a5139d8996 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.380 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] VM Resumed (Lifecycle Event)
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.384 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.384 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.386 182096 INFO nova.virt.libvirt.driver [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Instance spawned successfully.
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.387 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.394 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.398 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.401 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.401 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.401 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.402 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.402 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.402 182096 DEBUG nova.virt.libvirt.driver [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.417 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.417 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159732.3832073, 733473e1-d4b4-4076-8736-00a5139d8996 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.418 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] VM Started (Lifecycle Event)
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.436 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.438 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.452 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.464 182096 INFO nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Took 1.55 seconds to spawn the instance on the hypervisor.
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.464 182096 DEBUG nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.524 182096 INFO nova.compute.manager [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Took 2.09 seconds to build instance.
Jan 23 09:15:32 compute-0 nova_compute[182092]: 2026-01-23 09:15:32.536 182096 DEBUG oslo_concurrency.lockutils [None req-1ae2b67c-7b50-4d80-8713-2a7fa6c3cecf e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:33 compute-0 nova_compute[182092]: 2026-01-23 09:15:33.461 182096 DEBUG nova.compute.manager [None req-d783e023-f9c6-4231-ab8e-0cbe2dfa9d57 6b85fa10bb204fd1a45be9cb36eac69a 8e6a384085d84a008c65bd26147b0fa5 - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:33 compute-0 nova_compute[182092]: 2026-01-23 09:15:33.463 182096 INFO nova.compute.manager [None req-d783e023-f9c6-4231-ab8e-0cbe2dfa9d57 6b85fa10bb204fd1a45be9cb36eac69a 8e6a384085d84a008c65bd26147b0fa5 - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Retrieving diagnostics
Jan 23 09:15:33 compute-0 nova_compute[182092]: 2026-01-23 09:15:33.936 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:36 compute-0 podman[212305]: 2026-01-23 09:15:36.227877891 +0000 UTC m=+0.062017422 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:15:36 compute-0 nova_compute[182092]: 2026-01-23 09:15:36.872 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:38 compute-0 nova_compute[182092]: 2026-01-23 09:15:38.937 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:39 compute-0 nova_compute[182092]: 2026-01-23 09:15:39.704 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:39.852 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:39.853 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:15:39.853 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:41 compute-0 nova_compute[182092]: 2026-01-23 09:15:41.874 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:42 compute-0 podman[212341]: 2026-01-23 09:15:42.21843469 +0000 UTC m=+0.045976328 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:15:42 compute-0 podman[212340]: 2026-01-23 09:15:42.22220361 +0000 UTC m=+0.051898241 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.647 182096 DEBUG nova.compute.manager [None req-fdf1ab44-d684-48c0-8f82-27c08e63a8fb 6b85fa10bb204fd1a45be9cb36eac69a 8e6a384085d84a008c65bd26147b0fa5 - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.650 182096 INFO nova.compute.manager [None req-fdf1ab44-d684-48c0-8f82-27c08e63a8fb 6b85fa10bb204fd1a45be9cb36eac69a 8e6a384085d84a008c65bd26147b0fa5 - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Retrieving diagnostics
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.882 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "733473e1-d4b4-4076-8736-00a5139d8996" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.883 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.883 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "733473e1-d4b4-4076-8736-00a5139d8996-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.883 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.884 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.890 182096 INFO nova.compute.manager [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Terminating instance
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.895 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "refresh_cache-733473e1-d4b4-4076-8736-00a5139d8996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.895 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquired lock "refresh_cache-733473e1-d4b4-4076-8736-00a5139d8996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.896 182096 DEBUG nova.network.neutron [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.918 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159728.9174578, e20a497a-3335-4412-8252-160abc20748b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.919 182096 INFO nova.compute.manager [-] [instance: e20a497a-3335-4412-8252-160abc20748b] VM Stopped (Lifecycle Event)
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.930 182096 DEBUG nova.compute.manager [None req-ca76c843-b31b-4e56-87c4-420bb5251480 - - - - - -] [instance: e20a497a-3335-4412-8252-160abc20748b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:43 compute-0 nova_compute[182092]: 2026-01-23 09:15:43.939 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.079 182096 DEBUG nova.network.neutron [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.467 182096 DEBUG nova.network.neutron [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.481 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Releasing lock "refresh_cache-733473e1-d4b4-4076-8736-00a5139d8996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.482 182096 DEBUG nova.compute.manager [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:15:44 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 23 09:15:44 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Consumed 10.712s CPU time.
Jan 23 09:15:44 compute-0 systemd-machined[153562]: Machine qemu-12-instance-0000001c terminated.
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.712 182096 INFO nova.virt.libvirt.driver [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Instance destroyed successfully.
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.712 182096 DEBUG nova.objects.instance [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lazy-loading 'resources' on Instance uuid 733473e1-d4b4-4076-8736-00a5139d8996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.730 182096 INFO nova.virt.libvirt.driver [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Deleting instance files /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996_del
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.730 182096 INFO nova.virt.libvirt.driver [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Deletion of /var/lib/nova/instances/733473e1-d4b4-4076-8736-00a5139d8996_del complete
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.778 182096 INFO nova.compute.manager [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.778 182096 DEBUG oslo.service.loopingcall [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.778 182096 DEBUG nova.compute.manager [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.778 182096 DEBUG nova.network.neutron [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.912 182096 DEBUG nova.network.neutron [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.925 182096 DEBUG nova.network.neutron [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.934 182096 INFO nova.compute.manager [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Took 0.16 seconds to deallocate network for instance.
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.989 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:44 compute-0 nova_compute[182092]: 2026-01-23 09:15:44.990 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:45 compute-0 nova_compute[182092]: 2026-01-23 09:15:45.047 182096 DEBUG nova.compute.provider_tree [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:45 compute-0 nova_compute[182092]: 2026-01-23 09:15:45.058 182096 DEBUG nova.scheduler.client.report [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:45 compute-0 nova_compute[182092]: 2026-01-23 09:15:45.071 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:45 compute-0 nova_compute[182092]: 2026-01-23 09:15:45.088 182096 INFO nova.scheduler.client.report [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Deleted allocations for instance 733473e1-d4b4-4076-8736-00a5139d8996
Jan 23 09:15:45 compute-0 nova_compute[182092]: 2026-01-23 09:15:45.138 182096 DEBUG oslo_concurrency.lockutils [None req-7cb6388e-de6e-4228-8461-1abea2767022 e2fcc88735d5430ca53b5d3ce0a141e3 81428a503adb47a2a1a371c1a274572a - - default default] Lock "733473e1-d4b4-4076-8736-00a5139d8996" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:46 compute-0 nova_compute[182092]: 2026-01-23 09:15:46.876 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.883 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.883 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.894 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.940 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.951 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.951 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.957 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:15:48 compute-0 nova_compute[182092]: 2026-01-23 09:15:48.957 182096 INFO nova.compute.claims [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.055 182096 DEBUG nova.compute.provider_tree [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.068 182096 DEBUG nova.scheduler.client.report [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.082 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.082 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.130 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.130 182096 DEBUG nova.network.neutron [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.143 182096 INFO nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.156 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.222 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.223 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.223 182096 INFO nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Creating image(s)
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.224 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.224 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.224 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.235 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.280 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.281 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.281 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.290 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.333 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.334 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.353 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.354 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.355 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.397 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.398 182096 DEBUG nova.virt.disk.api [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Checking if we can resize image /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.398 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.443 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.444 182096 DEBUG nova.virt.disk.api [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Cannot resize image /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.444 182096 DEBUG nova.objects.instance [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lazy-loading 'migration_context' on Instance uuid bf6fb354-5e55-40b5-b2c5-5d1d55ee992c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.454 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.455 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Ensure instance console log exists: /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.455 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.455 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.456 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.587 182096 DEBUG nova.network.neutron [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.587 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.588 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.591 182096 WARNING nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.594 182096 DEBUG nova.virt.libvirt.host [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.595 182096 DEBUG nova.virt.libvirt.host [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.597 182096 DEBUG nova.virt.libvirt.host [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.597 182096 DEBUG nova.virt.libvirt.host [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.598 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.598 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.599 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.599 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.599 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.599 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.599 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.600 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.600 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.600 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.600 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.600 182096 DEBUG nova.virt.hardware [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.603 182096 DEBUG nova.objects.instance [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf6fb354-5e55-40b5-b2c5-5d1d55ee992c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.611 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <uuid>bf6fb354-5e55-40b5-b2c5-5d1d55ee992c</uuid>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <name>instance-0000001e</name>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-2140345047</nova:name>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:15:49</nova:creationTime>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:user uuid="1bcdbf60b3f541eaa8352073858601f5">tempest-ServersAdminNegativeTestJSON-153050740-project-member</nova:user>
Jan 23 09:15:49 compute-0 nova_compute[182092]:         <nova:project uuid="f07a505f47df4c1bb4bc8c49c96a92d4">tempest-ServersAdminNegativeTestJSON-153050740</nova:project>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <system>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="serial">bf6fb354-5e55-40b5-b2c5-5d1d55ee992c</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="uuid">bf6fb354-5e55-40b5-b2c5-5d1d55ee992c</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </system>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <os>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </os>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <features>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </features>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.config"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/console.log" append="off"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <video>
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </video>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:15:49 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:15:49 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:15:49 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:15:49 compute-0 nova_compute[182092]: </domain>
Jan 23 09:15:49 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.640 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.640 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.641 182096 INFO nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Using config drive
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.903 182096 INFO nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Creating config drive at /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.config
Jan 23 09:15:49 compute-0 nova_compute[182092]: 2026-01-23 09:15:49.907 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn6xqsap0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.024 182096 DEBUG oslo_concurrency.processutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn6xqsap0" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:50 compute-0 systemd-machined[153562]: New machine qemu-13-instance-0000001e.
Jan 23 09:15:50 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000001e.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.662 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159750.6616008, bf6fb354-5e55-40b5-b2c5-5d1d55ee992c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.662 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] VM Resumed (Lifecycle Event)
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.667 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.668 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.671 182096 INFO nova.virt.libvirt.driver [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance spawned successfully.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.671 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.682 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.686 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.689 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.689 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.690 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.690 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.691 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.691 182096 DEBUG nova.virt.libvirt.driver [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.715 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.715 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159750.6654203, bf6fb354-5e55-40b5-b2c5-5d1d55ee992c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.715 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] VM Started (Lifecycle Event)
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.744 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.746 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.761 182096 INFO nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Took 1.54 seconds to spawn the instance on the hypervisor.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.762 182096 DEBUG nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.763 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.818 182096 INFO nova.compute.manager [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Took 1.89 seconds to build instance.
Jan 23 09:15:50 compute-0 nova_compute[182092]: 2026-01-23 09:15:50.828 182096 DEBUG oslo_concurrency.lockutils [None req-befa6319-2145-4c2e-a195-b2897a333f48 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:51 compute-0 nova_compute[182092]: 2026-01-23 09:15:51.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:51 compute-0 nova_compute[182092]: 2026-01-23 09:15:51.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:51 compute-0 nova_compute[182092]: 2026-01-23 09:15:51.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:51 compute-0 nova_compute[182092]: 2026-01-23 09:15:51.658 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:15:51 compute-0 nova_compute[182092]: 2026-01-23 09:15:51.877 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.676 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.677 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.677 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.678 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.717 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.773 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.774 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:52 compute-0 nova_compute[182092]: 2026-01-23 09:15:52.826 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.016 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.018 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.37364959716797GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.019 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.019 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.126 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance bf6fb354-5e55-40b5-b2c5-5d1d55ee992c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.126 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.127 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.172 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.186 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.202 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.203 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:53 compute-0 nova_compute[182092]: 2026-01-23 09:15:53.941 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.199 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.200 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.200 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.221 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:54 compute-0 nova_compute[182092]: 2026-01-23 09:15:54.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:55 compute-0 podman[212438]: 2026-01-23 09:15:55.218370189 +0000 UTC m=+0.051072132 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:15:55 compute-0 podman[212439]: 2026-01-23 09:15:55.232740683 +0000 UTC m=+0.064481292 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:15:55 compute-0 nova_compute[182092]: 2026-01-23 09:15:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:15:56 compute-0 nova_compute[182092]: 2026-01-23 09:15:56.878 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.791 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.791 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.806 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.875 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.875 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.881 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.881 182096 INFO nova.compute.claims [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.943 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.979 182096 DEBUG nova.compute.provider_tree [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:15:58 compute-0 nova_compute[182092]: 2026-01-23 09:15:58.992 182096 DEBUG nova.scheduler.client.report [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.008 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.009 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.050 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.051 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.064 182096 INFO nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.081 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:15:59 compute-0 podman[212475]: 2026-01-23 09:15:59.193501966 +0000 UTC m=+0.035334879 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.195 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.196 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.197 182096 INFO nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating image(s)
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.197 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.197 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.198 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.211 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.267 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.268 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.269 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.278 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.323 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.324 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.344 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.345 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.346 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.367 182096 DEBUG nova.policy [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.390 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.390 182096 DEBUG nova.virt.disk.api [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Checking if we can resize image /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.390 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.434 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.435 182096 DEBUG nova.virt.disk.api [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Cannot resize image /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.435 182096 DEBUG nova.objects.instance [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.443 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.444 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Ensure instance console log exists: /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.444 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.444 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.444 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.711 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159744.71071, 733473e1-d4b4-4076-8736-00a5139d8996 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.711 182096 INFO nova.compute.manager [-] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] VM Stopped (Lifecycle Event)
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.729 182096 DEBUG nova.compute.manager [None req-3ca88fd1-7340-4963-bfbb-4891e027a993 - - - - - -] [instance: 733473e1-d4b4-4076-8736-00a5139d8996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:15:59 compute-0 nova_compute[182092]: 2026-01-23 09:15:59.994 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Successfully created port: 1eeea937-6c85-4302-be8a-532452fb9f66 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:16:00 compute-0 nova_compute[182092]: 2026-01-23 09:16:00.783 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Successfully updated port: 1eeea937-6c85-4302-be8a-532452fb9f66 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:16:00 compute-0 nova_compute[182092]: 2026-01-23 09:16:00.794 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:00 compute-0 nova_compute[182092]: 2026-01-23 09:16:00.794 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:00 compute-0 nova_compute[182092]: 2026-01-23 09:16:00.794 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:16:00 compute-0 nova_compute[182092]: 2026-01-23 09:16:00.902 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.078 182096 DEBUG nova.compute.manager [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-changed-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.079 182096 DEBUG nova.compute.manager [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Refreshing instance network info cache due to event network-changed-1eeea937-6c85-4302-be8a-532452fb9f66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.079 182096 DEBUG oslo_concurrency.lockutils [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.553 182096 DEBUG nova.network.neutron [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.571 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.571 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance network_info: |[{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.571 182096 DEBUG oslo_concurrency.lockutils [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.571 182096 DEBUG nova.network.neutron [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Refreshing network info cache for port 1eeea937-6c85-4302-be8a-532452fb9f66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.573 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Start _get_guest_xml network_info=[{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.577 182096 WARNING nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.581 182096 DEBUG nova.virt.libvirt.host [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.581 182096 DEBUG nova.virt.libvirt.host [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.586 182096 DEBUG nova.virt.libvirt.host [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.586 182096 DEBUG nova.virt.libvirt.host [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.587 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.587 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.587 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.587 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.587 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.588 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.589 182096 DEBUG nova.virt.hardware [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.591 182096 DEBUG nova.virt.libvirt.vif [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:15:59Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.591 182096 DEBUG nova.network.os_vif_util [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.592 182096 DEBUG nova.network.os_vif_util [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.593 182096 DEBUG nova.objects.instance [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.606 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <uuid>2c49518d-6513-4224-844e-0aab2ea675e7</uuid>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <name>instance-00000020</name>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-142208586</nova:name>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:16:01</nova:creationTime>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:user uuid="93e6ff4cf2404b8db0db1ed141716461">tempest-LiveAutoBlockMigrationV225Test-779044727-project-member</nova:user>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:project uuid="3b790db365e14138976e54a3cdfc8140">tempest-LiveAutoBlockMigrationV225Test-779044727</nova:project>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         <nova:port uuid="1eeea937-6c85-4302-be8a-532452fb9f66">
Jan 23 09:16:01 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <system>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="serial">2c49518d-6513-4224-844e-0aab2ea675e7</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="uuid">2c49518d-6513-4224-844e-0aab2ea675e7</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </system>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <os>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </os>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <features>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </features>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:15:7b:0b"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <target dev="tap1eeea937-6c"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/console.log" append="off"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <video>
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </video>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:16:01 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:16:01 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:16:01 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:16:01 compute-0 nova_compute[182092]: </domain>
Jan 23 09:16:01 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.607 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Preparing to wait for external event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.607 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.607 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.607 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.608 182096 DEBUG nova.virt.libvirt.vif [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-77
9044727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:15:59Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.608 182096 DEBUG nova.network.os_vif_util [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.608 182096 DEBUG nova.network.os_vif_util [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.609 182096 DEBUG os_vif [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.609 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.609 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.610 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.611 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.612 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1eeea937-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.612 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1eeea937-6c, col_values=(('external_ids', {'iface-id': '1eeea937-6c85-4302-be8a-532452fb9f66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:7b:0b', 'vm-uuid': '2c49518d-6513-4224-844e-0aab2ea675e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.613 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:01 compute-0 NetworkManager[54920]: <info>  [1769159761.6141] manager: (tap1eeea937-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.618 182096 INFO os_vif [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c')
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.646 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.646 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.646 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No VIF found with MAC fa:16:3e:15:7b:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.647 182096 INFO nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Using config drive
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.874 182096 INFO nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating config drive at /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.880 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpftg1gcuf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:01 compute-0 nova_compute[182092]: 2026-01-23 09:16:01.895 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.000 182096 DEBUG oslo_concurrency.processutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpftg1gcuf" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:02 compute-0 kernel: tap1eeea937-6c: entered promiscuous mode
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.047 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.0504] manager: (tap1eeea937-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 09:16:02 compute-0 ovn_controller[94697]: 2026-01-23T09:16:02Z|00063|binding|INFO|Claiming lport 1eeea937-6c85-4302-be8a-532452fb9f66 for this chassis.
Jan 23 09:16:02 compute-0 ovn_controller[94697]: 2026-01-23T09:16:02Z|00064|binding|INFO|1eeea937-6c85-4302-be8a-532452fb9f66: Claiming fa:16:3e:15:7b:0b 10.100.0.12
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.051 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.066 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:7b:0b 10.100.0.12'], port_security=['fa:16:3e:15:7b:0b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1eeea937-6c85-4302-be8a-532452fb9f66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.068 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1eeea937-6c85-4302-be8a-532452fb9f66 in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 bound to our chassis
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.070 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:02 compute-0 systemd-udevd[212540]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.079 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[86266689-578c-4137-8169-8914f6113432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 systemd-machined[153562]: New machine qemu-14-instance-00000020.
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.080 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape6fc57f5-31 in ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.083 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape6fc57f5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.083 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bde71ef7-961d-459e-b324-f6cf2eb46d35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.084 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[496e7057-ffe0-430b-b9c6-b2d3abb0b48b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.0923] device (tap1eeea937-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.0927] device (tap1eeea937-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.094 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[18818ce8-564a-48c5-a38b-f6281436fbc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.108 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.108 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2360d65f-0ea6-4f94-8d6e-74cd324f38c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_controller[94697]: 2026-01-23T09:16:02Z|00065|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 ovn-installed in OVS
Jan 23 09:16:02 compute-0 ovn_controller[94697]: 2026-01-23T09:16:02Z|00066|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 up in Southbound
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.113 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000020.
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.139 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce8b3b1-7bca-43c5-8df8-4edef5d2e372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.1443] manager: (tape6fc57f5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.144 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6114a10c-8c6f-4434-9e5f-da700a030f04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.167 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e8afa00b-d369-4124-85db-e3d1a4100efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.170 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[14369ade-a7e6-4c99-b0c0-5fb8d0150f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.1857] device (tape6fc57f5-30): carrier: link connected
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.188 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[71ccc0d3-e471-4e09-b3af-dc2f94013b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e256839-73b9-4417-8137-6af696736947]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326767, 'reachable_time': 15225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212564, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.213 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4a79b96e-dc80-40ff-aa10-d7dd4cdb00f3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4c:e66b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326767, 'tstamp': 326767}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212565, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.224 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[742c5e91-8bd7-46b4-9c68-ba83805e46cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326767, 'reachable_time': 15225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212566, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.246 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4580fb-d973-4d91-8fb7-066bb1e3b5d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.283 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea6cf80-b098-4105-9a2f-dec719706c18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.284 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.284 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.284 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6fc57f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:02 compute-0 kernel: tape6fc57f5-30: entered promiscuous mode
Jan 23 09:16:02 compute-0 NetworkManager[54920]: <info>  [1769159762.2867] manager: (tape6fc57f5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.289 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.293 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape6fc57f5-30, col_values=(('external_ids', {'iface-id': '840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:02 compute-0 ovn_controller[94697]: 2026-01-23T09:16:02Z|00067|binding|INFO|Releasing lport 840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0 from this chassis (sb_readonly=0)
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.294 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.306 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.308 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.309 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f25a47-5909-42f2-a5b3-947cf03ec40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.309 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:16:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:02.309 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'env', 'PROCESS_TAG=haproxy-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e6fc57f5-3894-49ed-8321-7285a3da09a0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.491 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159762.4911563, 2c49518d-6513-4224-844e-0aab2ea675e7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.491 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Started (Lifecycle Event)
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.504 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.508 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159762.4912307, 2c49518d-6513-4224-844e-0aab2ea675e7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.508 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Paused (Lifecycle Event)
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.518 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.521 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.534 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:16:02 compute-0 podman[212601]: 2026-01-23 09:16:02.602268983 +0000 UTC m=+0.031351254 container create ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:16:02 compute-0 systemd[1]: Started libpod-conmon-ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58.scope.
Jan 23 09:16:02 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:16:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cef1a3f165bfdc509dac704fac8edf96548451033c7139a066935af99f1ed76/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:16:02 compute-0 podman[212601]: 2026-01-23 09:16:02.658482635 +0000 UTC m=+0.087564926 container init ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:16:02 compute-0 podman[212601]: 2026-01-23 09:16:02.664716466 +0000 UTC m=+0.093798738 container start ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:16:02 compute-0 podman[212601]: 2026-01-23 09:16:02.588233752 +0000 UTC m=+0.017316043 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:16:02 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [NOTICE]   (212617) : New worker (212619) forked
Jan 23 09:16:02 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [NOTICE]   (212617) : Loading success.
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.757 182096 DEBUG nova.compute.manager [req-0d08be31-17b9-4adb-8852-985644cf7161 req-1a52821e-a45d-454d-8817-da310e8c8d31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.758 182096 DEBUG oslo_concurrency.lockutils [req-0d08be31-17b9-4adb-8852-985644cf7161 req-1a52821e-a45d-454d-8817-da310e8c8d31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.759 182096 DEBUG oslo_concurrency.lockutils [req-0d08be31-17b9-4adb-8852-985644cf7161 req-1a52821e-a45d-454d-8817-da310e8c8d31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.759 182096 DEBUG oslo_concurrency.lockutils [req-0d08be31-17b9-4adb-8852-985644cf7161 req-1a52821e-a45d-454d-8817-da310e8c8d31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.760 182096 DEBUG nova.compute.manager [req-0d08be31-17b9-4adb-8852-985644cf7161 req-1a52821e-a45d-454d-8817-da310e8c8d31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Processing event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.761 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.764 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159762.763702, 2c49518d-6513-4224-844e-0aab2ea675e7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.764 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Resumed (Lifecycle Event)
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.765 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.767 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.767 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.767 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.768 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.768 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.771 182096 INFO nova.virt.libvirt.driver [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance spawned successfully.
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.772 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.777 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.778 182096 INFO nova.compute.manager [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Terminating instance
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.782 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.783 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "refresh_cache-bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.783 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquired lock "refresh_cache-bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.784 182096 DEBUG nova.network.neutron [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.789 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.789 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.790 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.790 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.790 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.791 182096 DEBUG nova.virt.libvirt.driver [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.810 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.815 182096 DEBUG nova.network.neutron [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updated VIF entry in instance network info cache for port 1eeea937-6c85-4302-be8a-532452fb9f66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.815 182096 DEBUG nova.network.neutron [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.840 182096 DEBUG oslo_concurrency.lockutils [req-c71760e5-49ac-4512-87f9-5b69613ebc1b req-e19644fa-9746-4474-bd23-ccf8750dea86 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.861 182096 INFO nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Took 3.66 seconds to spawn the instance on the hypervisor.
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.861 182096 DEBUG nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.937 182096 INFO nova.compute.manager [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Took 4.09 seconds to build instance.
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.951 182096 DEBUG oslo_concurrency.lockutils [None req-7624803e-7bb5-4d32-8b82-e5b300699ba5 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:02 compute-0 nova_compute[182092]: 2026-01-23 09:16:02.958 182096 DEBUG nova.network.neutron [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.149 182096 DEBUG nova.network.neutron [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.167 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Releasing lock "refresh_cache-bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.167 182096 DEBUG nova.compute.manager [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:16:03 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 23 09:16:03 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001e.scope: Consumed 10.156s CPU time.
Jan 23 09:16:03 compute-0 systemd-machined[153562]: Machine qemu-13-instance-0000001e terminated.
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.400 182096 INFO nova.virt.libvirt.driver [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance destroyed successfully.
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.400 182096 DEBUG nova.objects.instance [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lazy-loading 'resources' on Instance uuid bf6fb354-5e55-40b5-b2c5-5d1d55ee992c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.421 182096 INFO nova.virt.libvirt.driver [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Deleting instance files /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c_del
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.422 182096 INFO nova.virt.libvirt.driver [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Deletion of /var/lib/nova/instances/bf6fb354-5e55-40b5-b2c5-5d1d55ee992c_del complete
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.471 182096 INFO nova.compute.manager [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.471 182096 DEBUG oslo.service.loopingcall [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.472 182096 DEBUG nova.compute.manager [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.472 182096 DEBUG nova.network.neutron [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.642 182096 DEBUG nova.network.neutron [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.652 182096 DEBUG nova.network.neutron [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.662 182096 INFO nova.compute.manager [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Took 0.19 seconds to deallocate network for instance.
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.707 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.708 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.758 182096 DEBUG nova.compute.provider_tree [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.766 182096 DEBUG nova.scheduler.client.report [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.779 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.794 182096 INFO nova.scheduler.client.report [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Deleted allocations for instance bf6fb354-5e55-40b5-b2c5-5d1d55ee992c
Jan 23 09:16:03 compute-0 nova_compute[182092]: 2026-01-23 09:16:03.843 182096 DEBUG oslo_concurrency.lockutils [None req-a6c67d89-16cf-47ad-82c7-6a18a81eabce 1bcdbf60b3f541eaa8352073858601f5 f07a505f47df4c1bb4bc8c49c96a92d4 - - default default] Lock "bf6fb354-5e55-40b5-b2c5-5d1d55ee992c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.869 182096 DEBUG nova.compute.manager [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.870 182096 DEBUG oslo_concurrency.lockutils [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.870 182096 DEBUG oslo_concurrency.lockutils [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.870 182096 DEBUG oslo_concurrency.lockutils [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.871 182096 DEBUG nova.compute.manager [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:04 compute-0 nova_compute[182092]: 2026-01-23 09:16:04.871 182096 WARNING nova.compute.manager [req-5a1f32a5-b390-4470-b757-46758a09b6b7 req-9c77c8e5-c7d7-4335-84fa-696db3b3c3d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state None.
Jan 23 09:16:06 compute-0 nova_compute[182092]: 2026-01-23 09:16:06.151 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Check if temp file /var/lib/nova/instances/tmpwkg52m0w exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 23 09:16:06 compute-0 nova_compute[182092]: 2026-01-23 09:16:06.152 182096 DEBUG nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwkg52m0w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 23 09:16:06 compute-0 nova_compute[182092]: 2026-01-23 09:16:06.615 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:06 compute-0 nova_compute[182092]: 2026-01-23 09:16:06.880 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:06 compute-0 nova_compute[182092]: 2026-01-23 09:16:06.991 182096 DEBUG oslo_concurrency.processutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:07 compute-0 nova_compute[182092]: 2026-01-23 09:16:07.046 182096 DEBUG oslo_concurrency.processutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:07 compute-0 nova_compute[182092]: 2026-01-23 09:16:07.047 182096 DEBUG oslo_concurrency.processutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:07 compute-0 nova_compute[182092]: 2026-01-23 09:16:07.100 182096 DEBUG oslo_concurrency.processutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:07 compute-0 podman[212639]: 2026-01-23 09:16:07.219335071 +0000 UTC m=+0.056977965 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:16:09 compute-0 sshd-session[212662]: Accepted publickey for nova from 192.168.122.101 port 36414 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:16:09 compute-0 systemd-logind[746]: New session 47 of user nova.
Jan 23 09:16:09 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:16:09 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:16:09 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:16:09 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:16:09 compute-0 systemd[212666]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:16:09 compute-0 systemd[212666]: Queued start job for default target Main User Target.
Jan 23 09:16:09 compute-0 systemd[212666]: Created slice User Application Slice.
Jan 23 09:16:09 compute-0 systemd[212666]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:16:09 compute-0 systemd[212666]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:16:09 compute-0 systemd[212666]: Reached target Paths.
Jan 23 09:16:09 compute-0 systemd[212666]: Reached target Timers.
Jan 23 09:16:09 compute-0 systemd[212666]: Starting D-Bus User Message Bus Socket...
Jan 23 09:16:09 compute-0 systemd[212666]: Starting Create User's Volatile Files and Directories...
Jan 23 09:16:09 compute-0 systemd[212666]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:16:09 compute-0 systemd[212666]: Reached target Sockets.
Jan 23 09:16:09 compute-0 systemd[212666]: Finished Create User's Volatile Files and Directories.
Jan 23 09:16:09 compute-0 systemd[212666]: Reached target Basic System.
Jan 23 09:16:09 compute-0 systemd[212666]: Reached target Main User Target.
Jan 23 09:16:09 compute-0 systemd[212666]: Startup finished in 96ms.
Jan 23 09:16:09 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:16:09 compute-0 systemd[1]: Started Session 47 of User nova.
Jan 23 09:16:09 compute-0 sshd-session[212662]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:16:09 compute-0 sshd-session[212682]: Received disconnect from 192.168.122.101 port 36414:11: disconnected by user
Jan 23 09:16:09 compute-0 sshd-session[212682]: Disconnected from user nova 192.168.122.101 port 36414
Jan 23 09:16:09 compute-0 sshd-session[212662]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:16:09 compute-0 systemd-logind[746]: Session 47 logged out. Waiting for processes to exit.
Jan 23 09:16:09 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 09:16:09 compute-0 systemd-logind[746]: Removed session 47.
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.451 182096 DEBUG nova.compute.manager [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.452 182096 DEBUG oslo_concurrency.lockutils [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.452 182096 DEBUG oslo_concurrency.lockutils [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.452 182096 DEBUG oslo_concurrency.lockutils [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.452 182096 DEBUG nova.compute.manager [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.452 182096 DEBUG nova.compute.manager [req-0c9fafa1-2b38-47ee-9baa-3210d24bdf61 req-56ced9f2-6177-41a8-9058-32a33c053996 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.846 182096 INFO nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Took 3.74 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.846 182096 DEBUG nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.859 182096 DEBUG nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwkg52m0w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e192f319-ee7a-416f-856f-ccfb02421705),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.876 182096 DEBUG nova.objects.instance [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.877 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.879 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.879 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.890 182096 DEBUG nova.virt.libvirt.vif [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:16:02Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.890 182096 DEBUG nova.network.os_vif_util [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.891 182096 DEBUG nova.network.os_vif_util [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.892 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 09:16:10 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:15:7b:0b"/>
Jan 23 09:16:10 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:16:10 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:16:10 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:16:10 compute-0 nova_compute[182092]:   <target dev="tap1eeea937-6c"/>
Jan 23 09:16:10 compute-0 nova_compute[182092]: </interface>
Jan 23 09:16:10 compute-0 nova_compute[182092]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 23 09:16:10 compute-0 nova_compute[182092]: 2026-01-23 09:16:10.892 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.382 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.382 182096 INFO nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.460 182096 INFO nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.618 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.882 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.962 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:11 compute-0 nova_compute[182092]: 2026-01-23 09:16:11.963 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.465 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.466 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.552 182096 DEBUG nova.compute.manager [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.552 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.553 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.554 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.554 182096 DEBUG nova.compute.manager [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.555 182096 WARNING nova.compute.manager [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state migrating.
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.556 182096 DEBUG nova.compute.manager [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-changed-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.556 182096 DEBUG nova.compute.manager [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Refreshing instance network info cache due to event network-changed-1eeea937-6c85-4302-be8a-532452fb9f66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.557 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.557 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.558 182096 DEBUG nova.network.neutron [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Refreshing network info cache for port 1eeea937-6c85-4302-be8a-532452fb9f66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.971 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:12 compute-0 nova_compute[182092]: 2026-01-23 09:16:12.973 182096 DEBUG nova.virt.libvirt.migration [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.177 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159773.1767719, 2c49518d-6513-4224-844e-0aab2ea675e7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.177 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Paused (Lifecycle Event)
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.193 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.197 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.215 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 23 09:16:13 compute-0 podman[212702]: 2026-01-23 09:16:13.224825306 +0000 UTC m=+0.049354221 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:16:13 compute-0 podman[212701]: 2026-01-23 09:16:13.23025992 +0000 UTC m=+0.057384993 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 23 09:16:13 compute-0 kernel: tap1eeea937-6c (unregistering): left promiscuous mode
Jan 23 09:16:13 compute-0 NetworkManager[54920]: <info>  [1769159773.4029] device (tap1eeea937-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:16:13 compute-0 ovn_controller[94697]: 2026-01-23T09:16:13Z|00068|binding|INFO|Releasing lport 1eeea937-6c85-4302-be8a-532452fb9f66 from this chassis (sb_readonly=0)
Jan 23 09:16:13 compute-0 ovn_controller[94697]: 2026-01-23T09:16:13Z|00069|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 down in Southbound
Jan 23 09:16:13 compute-0 ovn_controller[94697]: 2026-01-23T09:16:13Z|00070|binding|INFO|Removing iface tap1eeea937-6c ovn-installed in OVS
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.405 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.408 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.410 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:7b:0b 10.100.0.12'], port_security=['fa:16:3e:15:7b:0b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1166248c-76b2-44bd-b867-c12c2a2e3d39'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1eeea937-6c85-4302-be8a-532452fb9f66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.413 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1eeea937-6c85-4302-be8a-532452fb9f66 in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 unbound from our chassis
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.415 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6fc57f5-3894-49ed-8321-7285a3da09a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.416 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4176e0-3c18-4ed9-806e-9b25e8e2570a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.416 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 namespace which is not needed anymore
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.421 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:13 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 23 09:16:13 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Consumed 11.090s CPU time.
Jan 23 09:16:13 compute-0 systemd-machined[153562]: Machine qemu-14-instance-00000020 terminated.
Jan 23 09:16:13 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [NOTICE]   (212617) : haproxy version is 2.8.14-c23fe91
Jan 23 09:16:13 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [NOTICE]   (212617) : path to executable is /usr/sbin/haproxy
Jan 23 09:16:13 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [ALERT]    (212617) : Current worker (212619) exited with code 143 (Terminated)
Jan 23 09:16:13 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[212613]: [WARNING]  (212617) : All workers exited. Exiting... (0)
Jan 23 09:16:13 compute-0 systemd[1]: libpod-ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58.scope: Deactivated successfully.
Jan 23 09:16:13 compute-0 podman[212760]: 2026-01-23 09:16:13.517352206 +0000 UTC m=+0.035460867 container died ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 09:16:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58-userdata-shm.mount: Deactivated successfully.
Jan 23 09:16:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-3cef1a3f165bfdc509dac704fac8edf96548451033c7139a066935af99f1ed76-merged.mount: Deactivated successfully.
Jan 23 09:16:13 compute-0 podman[212760]: 2026-01-23 09:16:13.540064761 +0000 UTC m=+0.058173421 container cleanup ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:16:13 compute-0 systemd[1]: libpod-conmon-ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58.scope: Deactivated successfully.
Jan 23 09:16:13 compute-0 podman[212783]: 2026-01-23 09:16:13.580021048 +0000 UTC m=+0.022956094 container remove ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.583 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3538662c-d22f-4389-b105-b5a93f69ae84]: (4, ('Fri Jan 23 09:16:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 (ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58)\nad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58\nFri Jan 23 09:16:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 (ad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58)\nad7be3a4fee6ca491f74c334f018ccf19b3b37691445402e3aa09b3be2869e58\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.584 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b8acd4a7-904f-4dde-8584-3f71b2a5a4b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.585 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.586 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:13 compute-0 kernel: tape6fc57f5-30: left promiscuous mode
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.601 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:13 compute-0 NetworkManager[54920]: <info>  [1769159773.6028] manager: (tap1eeea937-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3043d091-dbd0-40a4-9b42-5e4ad2745939]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.612 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9aadf252-1c8a-45f6-b3d2-d757af865dfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.613 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3541f62f-3649-41cb-b0f0-cc913bdaa12c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.625 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[db2b38d3-cbc3-465b-99d8-54c7c9c83d87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326762, 'reachable_time': 42709, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212805, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 systemd[1]: run-netns-ovnmeta\x2de6fc57f5\x2d3894\x2d49ed\x2d8321\x2d7285a3da09a0.mount: Deactivated successfully.
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.628 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:16:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:13.629 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[19f503c0-e2f1-4d1a-a37a-10d2cc5fea4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.630 182096 DEBUG nova.virt.libvirt.guest [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.631 182096 INFO nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migration operation has completed
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.631 182096 INFO nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] _post_live_migration() is started..
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.633 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.634 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 23 09:16:13 compute-0 nova_compute[182092]: 2026-01-23 09:16:13.634 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.405 182096 DEBUG nova.compute.manager [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.405 182096 DEBUG oslo_concurrency.lockutils [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.405 182096 DEBUG oslo_concurrency.lockutils [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.406 182096 DEBUG oslo_concurrency.lockutils [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.406 182096 DEBUG nova.compute.manager [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.406 182096 DEBUG nova.compute.manager [req-2d0c3b0e-78dd-4688-b1a7-f46f6f8db01c req-b58480a2-aa64-42e9-915f-76ba6ed04497 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.457 182096 DEBUG nova.network.neutron [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Activated binding for port 1eeea937-6c85-4302-be8a-532452fb9f66 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.457 182096 DEBUG nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.458 182096 DEBUG nova.virt.libvirt.vif [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:16:05Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.458 182096 DEBUG nova.network.os_vif_util [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.459 182096 DEBUG nova.network.os_vif_util [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.459 182096 DEBUG os_vif [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.460 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.461 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eeea937-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.466 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.468 182096 INFO os_vif [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c')
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.468 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.468 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.468 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.468 182096 DEBUG nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.469 182096 INFO nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Deleting instance files /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7_del
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.469 182096 INFO nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Deletion of /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7_del complete
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.547 182096 DEBUG nova.network.neutron [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updated VIF entry in instance network info cache for port 1eeea937-6c85-4302-be8a-532452fb9f66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.547 182096 DEBUG nova.network.neutron [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:14 compute-0 nova_compute[182092]: 2026-01-23 09:16:14.565 182096 DEBUG oslo_concurrency.lockutils [req-986b8162-6815-4809-8c5e-bf77f76368b6 req-88e945e1-6276-4f20-9db3-9f88f814d2ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.040 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.040 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.040 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.040 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.041 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.041 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.041 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.041 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.041 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.042 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.042 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.042 182096 WARNING nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state migrating.
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.042 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.042 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.043 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.043 182096 DEBUG oslo_concurrency.lockutils [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.043 182096 DEBUG nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.043 182096 WARNING nova.compute.manager [req-65b9eb4b-b094-4dab-96e9-53c0ed044499 req-38dbf541-7872-4080-8095-550a140a9ade 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state migrating.
Jan 23 09:16:16 compute-0 nova_compute[182092]: 2026-01-23 09:16:16.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.219 182096 DEBUG nova.compute.manager [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.219 182096 DEBUG oslo_concurrency.lockutils [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.219 182096 DEBUG oslo_concurrency.lockutils [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.219 182096 DEBUG oslo_concurrency.lockutils [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.220 182096 DEBUG nova.compute.manager [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.220 182096 WARNING nova.compute.manager [req-2331374d-178a-458e-9e77-f759838afb4e req-037bb87a-05c1-44a2-a204-0deb6546ea06 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state migrating.
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.399 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159763.398149, bf6fb354-5e55-40b5-b2c5-5d1d55ee992c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.400 182096 INFO nova.compute.manager [-] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] VM Stopped (Lifecycle Event)
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.417 182096 DEBUG nova.compute.manager [None req-8a57b6cf-1b58-448f-b484-ae4fc708b05a - - - - - -] [instance: bf6fb354-5e55-40b5-b2c5-5d1d55ee992c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:18.808 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:18.809 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:16:18 compute-0 nova_compute[182092]: 2026-01-23 09:16:18.809 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.513 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.514 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.514 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.527 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.527 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.528 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.528 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:16:19 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:16:19 compute-0 systemd[212666]: Activating special unit Exit the Session...
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped target Main User Target.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped target Basic System.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped target Paths.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped target Sockets.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped target Timers.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:16:19 compute-0 systemd[212666]: Closed D-Bus User Message Bus Socket.
Jan 23 09:16:19 compute-0 systemd[212666]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:16:19 compute-0 systemd[212666]: Removed slice User Application Slice.
Jan 23 09:16:19 compute-0 systemd[212666]: Reached target Shutdown.
Jan 23 09:16:19 compute-0 systemd[212666]: Finished Exit the Session.
Jan 23 09:16:19 compute-0 systemd[212666]: Reached target Exit the Session.
Jan 23 09:16:19 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:16:19 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:16:19 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:16:19 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:16:19 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:16:19 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:16:19 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.740 182096 WARNING nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.741 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.37439727783203GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.742 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.742 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.779 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Migration for instance 2c49518d-6513-4224-844e-0aab2ea675e7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.794 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.824 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Migration e192f319-ee7a-416f-856f-ccfb02421705 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.825 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.825 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.865 182096 DEBUG nova.compute.provider_tree [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.878 182096 DEBUG nova.scheduler.client.report [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.908 182096 DEBUG nova.compute.resource_tracker [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.908 182096 DEBUG oslo_concurrency.lockutils [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.918 182096 INFO nova.compute.manager [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.983 182096 INFO nova.scheduler.client.report [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Deleted allocation for migration e192f319-ee7a-416f-856f-ccfb02421705
Jan 23 09:16:19 compute-0 nova_compute[182092]: 2026-01-23 09:16:19.983 182096 DEBUG nova.virt.libvirt.driver [None req-235af9d9-21a5-40bd-829c-832d006a9e24 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 23 09:16:21 compute-0 nova_compute[182092]: 2026-01-23 09:16:21.883 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating tmpfile /var/lib/nova/instances/tmpgccpd_6b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 23 09:16:21 compute-0 nova_compute[182092]: 2026-01-23 09:16:21.886 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:22 compute-0 nova_compute[182092]: 2026-01-23 09:16:22.013 182096 DEBUG nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgccpd_6b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 23 09:16:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:22.810 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:23 compute-0 nova_compute[182092]: 2026-01-23 09:16:23.524 182096 DEBUG nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgccpd_6b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 23 09:16:23 compute-0 nova_compute[182092]: 2026-01-23 09:16:23.542 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:23 compute-0 nova_compute[182092]: 2026-01-23 09:16:23.542 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:23 compute-0 nova_compute[182092]: 2026-01-23 09:16:23.542 182096 DEBUG nova.network.neutron [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.390 182096 DEBUG nova.network.neutron [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.409 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.417 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgccpd_6b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.417 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating instance directory: /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.418 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Creating disk.info with the contents: {'/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk': 'qcow2', '/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.418 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.418 182096 DEBUG nova.objects.instance [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.438 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.464 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.484 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.485 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.485 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.494 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.539 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.539 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.561 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.562 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.563 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.607 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.608 182096 DEBUG nova.virt.disk.api [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Checking if we can resize image /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.608 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.657 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.658 182096 DEBUG nova.virt.disk.api [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Cannot resize image /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.658 182096 DEBUG nova.objects.instance [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.671 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.690 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config 485376" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.691 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config to /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:16:24 compute-0 nova_compute[182092]: 2026-01-23 09:16:24.691 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.018 182096 DEBUG oslo_concurrency.processutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk.config /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.019 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.021 182096 DEBUG nova.virt.libvirt.vif [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:16:19Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.021 182096 DEBUG nova.network.os_vif_util [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.022 182096 DEBUG nova.network.os_vif_util [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.022 182096 DEBUG os_vif [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.023 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.023 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.024 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.025 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.025 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1eeea937-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.026 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1eeea937-6c, col_values=(('external_ids', {'iface-id': '1eeea937-6c85-4302-be8a-532452fb9f66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:7b:0b', 'vm-uuid': '2c49518d-6513-4224-844e-0aab2ea675e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:25 compute-0 NetworkManager[54920]: <info>  [1769159785.0278] manager: (tap1eeea937-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.027 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.030 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.031 182096 INFO os_vif [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c')
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.031 182096 DEBUG nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.031 182096 DEBUG nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgccpd_6b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.722 182096 DEBUG nova.network.neutron [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Port 1eeea937-6c85-4302-be8a-532452fb9f66 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.732 182096 DEBUG nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgccpd_6b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2c49518d-6513-4224-844e-0aab2ea675e7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 23 09:16:25 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 23 09:16:25 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 23 09:16:25 compute-0 podman[212838]: 2026-01-23 09:16:25.867301361 +0000 UTC m=+0.048845932 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:16:25 compute-0 podman[212839]: 2026-01-23 09:16:25.892407671 +0000 UTC m=+0.073845380 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:16:25 compute-0 kernel: tap1eeea937-6c: entered promiscuous mode
Jan 23 09:16:25 compute-0 NetworkManager[54920]: <info>  [1769159785.9399] manager: (tap1eeea937-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 23 09:16:25 compute-0 ovn_controller[94697]: 2026-01-23T09:16:25Z|00071|binding|INFO|Claiming lport 1eeea937-6c85-4302-be8a-532452fb9f66 for this additional chassis.
Jan 23 09:16:25 compute-0 ovn_controller[94697]: 2026-01-23T09:16:25Z|00072|binding|INFO|1eeea937-6c85-4302-be8a-532452fb9f66: Claiming fa:16:3e:15:7b:0b 10.100.0.12
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.941 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 ovn_controller[94697]: 2026-01-23T09:16:25Z|00073|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 ovn-installed in OVS
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.956 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 nova_compute[182092]: 2026-01-23 09:16:25.958 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:25 compute-0 systemd-udevd[212907]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:16:25 compute-0 systemd-machined[153562]: New machine qemu-15-instance-00000020.
Jan 23 09:16:25 compute-0 NetworkManager[54920]: <info>  [1769159785.9765] device (tap1eeea937-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:16:25 compute-0 NetworkManager[54920]: <info>  [1769159785.9773] device (tap1eeea937-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:16:25 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000020.
Jan 23 09:16:26 compute-0 nova_compute[182092]: 2026-01-23 09:16:26.887 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.134 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 2c49518d-6513-4224-844e-0aab2ea675e7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.134 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159788.1342528, 2c49518d-6513-4224-844e-0aab2ea675e7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.135 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Started (Lifecycle Event)
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.154 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.927 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159788.9275808, 2c49518d-6513-4224-844e-0aab2ea675e7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.928 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Resumed (Lifecycle Event)
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.948 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.950 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:28 compute-0 nova_compute[182092]: 2026-01-23 09:16:28.963 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 23 09:16:30 compute-0 ovn_controller[94697]: 2026-01-23T09:16:30Z|00074|binding|INFO|Claiming lport 1eeea937-6c85-4302-be8a-532452fb9f66 for this chassis.
Jan 23 09:16:30 compute-0 ovn_controller[94697]: 2026-01-23T09:16:30Z|00075|binding|INFO|1eeea937-6c85-4302-be8a-532452fb9f66: Claiming fa:16:3e:15:7b:0b 10.100.0.12
Jan 23 09:16:30 compute-0 ovn_controller[94697]: 2026-01-23T09:16:30Z|00076|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 up in Southbound
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.011 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:7b:0b 10.100.0.12'], port_security=['fa:16:3e:15:7b:0b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '18', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1eeea937-6c85-4302-be8a-532452fb9f66) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.011 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1eeea937-6c85-4302-be8a-532452fb9f66 in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 bound to our chassis
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.013 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.022 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b62ee7b-801f-4f02-8f15-022275c994c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.022 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape6fc57f5-31 in ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.026 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape6fc57f5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.026 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[618570cc-f170-4b7f-8b5a-6cf636b466c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.026 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b83eace-667d-46bd-bf0e-61f805a3e57b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.028 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.035 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6be2cde9-36c5-47a1-9be1-08fdb0ec6bf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.046 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[371fdc6d-a88d-4841-a769-c7a996995f24]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.067 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[522247fa-0d00-4938-9a3b-b9afeabffbd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 NetworkManager[54920]: <info>  [1769159790.0725] manager: (tape6fc57f5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.073 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f603662-21a6-4730-86b8-7be101d84b72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 systemd-udevd[212956]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.097 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[813e2bee-92f1-484c-b6ad-fae650cf5c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.099 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[225777af-c8a3-4300-9097-51ca029529a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 NetworkManager[54920]: <info>  [1769159790.1216] device (tape6fc57f5-30): carrier: link connected
Jan 23 09:16:30 compute-0 podman[212941]: 2026-01-23 09:16:30.12280048 +0000 UTC m=+0.063033068 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.125 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[701e588e-625a-4ba1-bb54-c097490b98f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.138 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a3df45bd-945a-405c-966a-12c600577121]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329560, 'reachable_time': 30930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212976, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.149 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[41c837eb-c160-40c8-a1a9-8ff91c22898f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4c:e66b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 329560, 'tstamp': 329560}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212978, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.162 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[365fcc01-b340-4254-852f-23d08520abb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329560, 'reachable_time': 30930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212979, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.183 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf090419-4e61-4d45-b635-ed5c7a4e093b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.192 182096 INFO nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Post operation of migration started
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.223 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[73c3128c-8c41-4da1-bad1-3a22593aa910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.224 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.224 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.225 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6fc57f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.227 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 kernel: tape6fc57f5-30: entered promiscuous mode
Jan 23 09:16:30 compute-0 NetworkManager[54920]: <info>  [1769159790.2275] manager: (tape6fc57f5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.228 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.230 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape6fc57f5-30, col_values=(('external_ids', {'iface-id': '840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 ovn_controller[94697]: 2026-01-23T09:16:30Z|00077|binding|INFO|Releasing lport 840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0 from this chassis (sb_readonly=0)
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.232 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.233 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dd454753-0f7b-4239-af3e-aa119f2329c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.233 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/e6fc57f5-3894-49ed-8321-7285a3da09a0.pid.haproxy
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:16:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:30.234 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'env', 'PROCESS_TAG=haproxy-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e6fc57f5-3894-49ed-8321-7285a3da09a0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.243 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:30 compute-0 podman[213008]: 2026-01-23 09:16:30.517101639 +0000 UTC m=+0.031207251 container create 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:16:30 compute-0 systemd[1]: Started libpod-conmon-4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583.scope.
Jan 23 09:16:30 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:16:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/738e74df8bbe9b9e44e1a6650d545f8574366dedf35b0e0322fd70e418021ce8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:16:30 compute-0 podman[213008]: 2026-01-23 09:16:30.587532854 +0000 UTC m=+0.101638456 container init 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.592 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.592 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:30 compute-0 nova_compute[182092]: 2026-01-23 09:16:30.593 182096 DEBUG nova.network.neutron [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:16:30 compute-0 podman[213008]: 2026-01-23 09:16:30.593555042 +0000 UTC m=+0.107660644 container start 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:16:30 compute-0 podman[213008]: 2026-01-23 09:16:30.502826693 +0000 UTC m=+0.016932315 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:16:30 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [NOTICE]   (213024) : New worker (213026) forked
Jan 23 09:16:30 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [NOTICE]   (213024) : Loading success.
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.471 182096 DEBUG nova.network.neutron [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.489 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.505 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.505 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.506 182096 DEBUG oslo_concurrency.lockutils [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.509 182096 INFO nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 23 09:16:31 compute-0 virtqemud[181713]: Domain id=15 name='instance-00000020' uuid=2c49518d-6513-4224-844e-0aab2ea675e7 is tainted: custom-monitor
Jan 23 09:16:31 compute-0 nova_compute[182092]: 2026-01-23 09:16:31.891 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:32 compute-0 nova_compute[182092]: 2026-01-23 09:16:32.514 182096 INFO nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.000 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000020', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3b790db365e14138976e54a3cdfc8140', 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'hostId': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.000 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.012 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/memory.usage volume: 42.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd30960fd-83c4-4450-8350-c004d374a23d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.43359375, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'timestamp': '2026-01-23T09:16:33.000922', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '35a3924a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.541407218, 'message_signature': 'bbd25a45f7591ffba247db438be56a555e96dcad478768828a2f6a039c631607'}]}, 'timestamp': '2026-01-23 09:16:33.012531', '_unique_id': 'a254a71f1bd940b2a051b96ed7df8fb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.013 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.015 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2c49518d-6513-4224-844e-0aab2ea675e7 / tap1eeea937-6c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.015 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c0be82d-b243-406e-9e68-754055954e9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.014099', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35a416e8-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': '840b6d2f93f1f97cf8830af0559ad9b6c221819b41219843ece958dd4a787352'}]}, 'timestamp': '2026-01-23 09:16:33.015897', '_unique_id': '0b3d18a3369446f597152f9bfe0f7179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.033 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15ee798d-9662-496f-bfe4-dfb9240a0582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.017046', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35a6dcd4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '471e2292db312cec39aa4191fce372445a17c0ac388fde634a0cb970d6339e26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.017046', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35a6e710-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': 'b645f7ab1ac1dab5dd6b6d34a5491bcbd52105ed327bd9e7f0b7490ddedbdaae'}]}, 'timestamp': '2026-01-23 09:16:33.034308', '_unique_id': 'a63a29244b23462ba597de9e25104526'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.035 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.035 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab521266-add0-4b7f-a8cd-01302894e286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.035402', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35a71a0a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '5336a6e73b257cf3d174a9240e455d224b7b3c8a41f4afe64872307e9af4ce8b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.035402', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35a72310-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': 'afd8ded59c0d5a57f33b46a5da145c242fd2250ac24e68cad55a8a045e2f5e7b'}]}, 'timestamp': '2026-01-23 09:16:33.035843', '_unique_id': 'c5b4b499bda7456191b185229f17ae76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.036 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c77f2f1e-0062-4306-869c-8215e7999367', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.037302', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35a76460-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'c271c634ada9d04297054716bf05905372cdae38b18bbd61b42123b1fb1de800'}]}, 'timestamp': '2026-01-23 09:16:33.037526', '_unique_id': '7fe93ae148c24f4cbbe1910612d5d991'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.037 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.038 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.bytes volume: 12288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.038 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18dd1d27-c5a9-4c82-ab1b-fb9e14b71e41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12288, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.038592', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35a79764-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '94deffa7bc7a5b2a4c1fb8f33eef7de3a5ca26b826645a0a6ec21c59dd8cd505'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.038592', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35a79f8e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '69c0bbe69951b4cf27272a45c56f2ca47ba309df136956ed364dfe246bd9b7cb'}]}, 'timestamp': '2026-01-23 09:16:33.039031', '_unique_id': '2757f971dd7a42b78aa9c303d2b68941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.039 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.outgoing.bytes volume: 3892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c356d8b7-f7fb-4e05-9de4-6f70e30d8c1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3892, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.040108', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35a7d274-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'a293ea8dec1231bd272b3f705795b276645f3e3a22fec1bf7b6ab63e887bc0bc'}]}, 'timestamp': '2026-01-23 09:16:33.040345', '_unique_id': '62d685d464d540d58d9994d5259a54ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.048 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.048 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8fdae55-f065-41c1-88c6-5d209e559cbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.041386', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35a9174c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': '3288cc77e322a6842b8325c49c07c0d8d454d94016f2ab29973633a4c043ec1a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 
'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.041386', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35a920e8-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': 'a1ae01256354d1ffb71b9d1e038c58e4d09d2d6b0c374e71a449ca2fc91d4f1b'}]}, 'timestamp': '2026-01-23 09:16:33.048897', '_unique_id': 'ca64a6a5ff2e4d019a9dee4ecbd3f36b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.050 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>]
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.050 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.050 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b0329c6-e938-4fc8-b94f-9f15331d2297', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.050327', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35a96148-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '873731a698254fc2c14ef378f8b6708cf08784708fad577a4cde39ca5730e17f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 
'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.050327', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35a96a3a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '6941f30f70137fe044208f72aed36de40b35c96838822e8636a4c3c049cbaffc'}]}, 'timestamp': '2026-01-23 09:16:33.050774', '_unique_id': 'e7f7c6bec41a4babba1ab93aace906e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.051 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>]
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/cpu volume: 20000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '118433e2-0eb2-4d83-8ed3-52f4710a8001', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20000000, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'timestamp': '2026-01-23T09:16:33.052141', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '35a9a874-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.541407218, 'message_signature': '1c2f8bce95bf63576d541a1cc8a37c020c7ae7ed7745de49c9b70c56aec014b0'}]}, 'timestamp': '2026-01-23 09:16:33.052374', '_unique_id': 'e6e3b1a11d514cad916071e30d5aba75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.053 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '374e757e-21ca-41b3-9838-3d8fce0a0ce3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.053418', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35a9d9d4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'a377d574268995ecca3a46061f31a0de6c3a70be746b9670b54116b68f488914'}]}, 'timestamp': '2026-01-23 09:16:33.053642', '_unique_id': 'b2c5ddc67cc0488e9871babfe36d3bf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.054 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.outgoing.packets volume: 56 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8c1f369-7e7f-4a46-a465-0b1b9da55c73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 56, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.054747', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35aa0dd2-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'fdf1f4cb90a701685ef17ae6d0ad158dd1f5028188515e16f522ee7546754bac'}]}, 'timestamp': '2026-01-23 09:16:33.054974', '_unique_id': '5903d5ce08324418a7b7c8df338c5196'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '476f26e2-687a-4b88-ba91-7462662c42d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.056091', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35aa4298-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': '684a2d598fc9f3dddd19a986e087315c9670ff44876c855fdaaf118e34f3058e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.056091', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35aa4a90-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': '7350a07da076b86a6639b3019374e3364d82ef4082a6acbf2d295026dd972f3d'}]}, 'timestamp': '2026-01-23 09:16:33.056516', '_unique_id': '7dec474439af4232924c80f485915c80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.056 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.057 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b382bae4-a297-4579-9b61-341899cdb497', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.057574', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35aa7cea-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'e7f5aa61b60c1a74dc50be220a192a53d269f88f5ffb6dc3479fd21d561fbdd0'}]}, 'timestamp': '2026-01-23 09:16:33.057817', '_unique_id': 'ac48618634cc4032aa086b8958297048'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.058 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc0c3e71-9c47-40f9-90f1-5e8533723e6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.058965', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35aab282-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': '56df04a020e018ae3f7a9835a78768d33d340fdfce08b91f978dc57b044ea0ed'}]}, 'timestamp': '2026-01-23 09:16:33.059201', '_unique_id': '5f856527b32841dbbfc417a9dbec7200'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.060 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.latency volume: 658722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.060 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f151b90-c434-4a75-a7af-830e6ad3164b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 658722, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.060238', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35aae428-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': 'd45ba57dbd87fa8aa8c0c7db89eed62b8bce662cd75eb7f3e1e2903bef8fcabc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.060238', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35aaec20-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '734b8d2313f98c88fd28c2097c9f0c6e2e4dca04e950f6c410f656a6c85bd26a'}]}, 'timestamp': '2026-01-23 09:16:33.060665', '_unique_id': '01001262861f44a09c653ea87d3e191c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.061 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e7393da-4290-4fdb-a1e3-59e9c1547f63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.061715', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35ab1de4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': 'fea8252ef48cb18798fdc7da5959e33f8f69565da20108bbbc19c134d7c171e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.061715', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35ab25d2-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.570765518, 'message_signature': '61798af482a91539afc0d9d9de5101f268727138d3a97241a94138e2e5c36c43'}]}, 'timestamp': '2026-01-23 09:16:33.062125', '_unique_id': '64af8d83b7a345aab78295b0b42e1bb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.062 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>]
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-142208586>]
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.063 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eb62b1d-383b-4e2d-a19e-35ba5bfe8412', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.063823', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35ab705a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': '1958589d53c04b577689508b71ccdccea7fc607b82e472888a8e7a2ed40fd7f0'}]}, 'timestamp': '2026-01-23 09:16:33.064051', '_unique_id': '407425fffb14413a9b5ce258c86b159c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '887420ae-b389-4e21-8d80-5d5b79cb6294', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.065093', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35aba25a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': 'be09c06f977aef7055d5046a932686ce77aaef9911e1068cd12fd383fed19eb2'}]}, 'timestamp': '2026-01-23 09:16:33.065330', '_unique_id': '4059ff419b714edfbbfbc579aab5de4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.066 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef112c64-c866-4ff5-b6a8-c57ec810a6da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': 'instance-00000020-2c49518d-6513-4224-844e-0aab2ea675e7-tap1eeea937-6c', 'timestamp': '2026-01-23T09:16:33.066355', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'tap1eeea937-6c', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:7b:0b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1eeea937-6c'}, 'message_id': '35abd31a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.543481224, 'message_signature': '91f957cc91928547822de9fbbedce9e7a1bfb91d642fd07f79ca12e0c7d3a338'}]}, 'timestamp': '2026-01-23 09:16:33.066576', '_unique_id': 'af4daffc86be479382a59f2e14bdc900'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.requests volume: 2 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.067 12 DEBUG ceilometer.compute.pollsters [-] 2c49518d-6513-4224-844e-0aab2ea675e7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28280534-fc70-4079-a2f3-ff7e47f10efd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 2, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-vda', 'timestamp': '2026-01-23T09:16:33.067599', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35ac047a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': 'd5eb8e33416a1705f2bbd08f6b95b79186449913cef4e727ff6a52bf20a702dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_name': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_name': None, 'resource_id': '2c49518d-6513-4224-844e-0aab2ea675e7-sda', 'timestamp': '2026-01-23T09:16:33.067599', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-142208586', 'name': 'instance-00000020', 'instance_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'instance_type': 'm1.nano', 'host': '5525c15eab99327e78273767fc08c250544621573bb5c99be691efbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35ac0ca4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3298.546427766, 'message_signature': '630bc28c33753a8ea602ac4392660b52188749375ec2c9812a6b887a10d757d8'}]}, 'timestamp': '2026-01-23 09:16:33.068036', '_unique_id': 'cf9aab9a2ad24ef285532352d4c7e3a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:16:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:16:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:16:33 compute-0 nova_compute[182092]: 2026-01-23 09:16:33.518 182096 INFO nova.virt.libvirt.driver [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 23 09:16:33 compute-0 nova_compute[182092]: 2026-01-23 09:16:33.522 182096 DEBUG nova.compute.manager [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:33 compute-0 nova_compute[182092]: 2026-01-23 09:16:33.538 182096 DEBUG nova.objects.instance [None req-b68df8ea-6603-49f2-bfa8-3b359b12c8c0 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.681 182096 DEBUG nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.682 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.682 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.682 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.683 182096 DEBUG nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.683 182096 WARNING nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state None.
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.683 182096 DEBUG nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.684 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.684 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.684 182096 DEBUG oslo_concurrency.lockutils [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.684 182096 DEBUG nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:34 compute-0 nova_compute[182092]: 2026-01-23 09:16:34.685 182096 WARNING nova.compute.manager [req-2e48007f-157f-4178-8365-f58639596db1 req-86b27163-2b58-429a-8f79-47e967358ae1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state active and task_state None.
Jan 23 09:16:35 compute-0 nova_compute[182092]: 2026-01-23 09:16:35.029 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:36 compute-0 nova_compute[182092]: 2026-01-23 09:16:36.892 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:38 compute-0 podman[213031]: 2026-01-23 09:16:38.225830886 +0000 UTC m=+0.057596596 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:16:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:39.853 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:39.854 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:39.854 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.032 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.733 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.733 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.751 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.843 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.843 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.853 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.854 182096 INFO nova.compute.claims [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.971 182096 DEBUG nova.compute.provider_tree [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.981 182096 DEBUG nova.scheduler.client.report [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.995 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:40 compute-0 nova_compute[182092]: 2026-01-23 09:16:40.996 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.048 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.049 182096 DEBUG nova.network.neutron [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.066 182096 INFO nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.076 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.187 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.188 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.189 182096 INFO nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Creating image(s)
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.189 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.189 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.190 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.200 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.248 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.249 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.250 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.259 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.306 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.307 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.328 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.329 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.329 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.375 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.376 182096 DEBUG nova.virt.disk.api [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Checking if we can resize image /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.376 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.422 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.423 182096 DEBUG nova.virt.disk.api [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Cannot resize image /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.423 182096 DEBUG nova.objects.instance [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lazy-loading 'migration_context' on Instance uuid 08d44559-7cdd-441f-9146-8146388afa84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.438 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.438 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Ensure instance console log exists: /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.439 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.439 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.439 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.518 182096 DEBUG nova.policy [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93e6ff4cf2404b8db0db1ed141716461', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b790db365e14138976e54a3cdfc8140', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:16:41 compute-0 nova_compute[182092]: 2026-01-23 09:16:41.893 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.135 182096 DEBUG nova.network.neutron [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Successfully updated port: 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.155 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.155 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquired lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.155 182096 DEBUG nova.network.neutron [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.289 182096 DEBUG nova.compute.manager [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-changed-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.289 182096 DEBUG nova.compute.manager [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Refreshing instance network info cache due to event network-changed-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.290 182096 DEBUG oslo_concurrency.lockutils [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:43 compute-0 nova_compute[182092]: 2026-01-23 09:16:43.319 182096 DEBUG nova.network.neutron [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:16:44 compute-0 podman[213070]: 2026-01-23 09:16:44.207221937 +0000 UTC m=+0.040154913 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:16:44 compute-0 podman[213069]: 2026-01-23 09:16:44.212211986 +0000 UTC m=+0.046573499 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.647 182096 DEBUG nova.network.neutron [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updating instance_info_cache with network_info: [{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.666 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Releasing lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.666 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Instance network_info: |[{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.667 182096 DEBUG oslo_concurrency.lockutils [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.667 182096 DEBUG nova.network.neutron [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Refreshing network info cache for port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.669 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Start _get_guest_xml network_info=[{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.672 182096 WARNING nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.676 182096 DEBUG nova.virt.libvirt.host [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.676 182096 DEBUG nova.virt.libvirt.host [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.681 182096 DEBUG nova.virt.libvirt.host [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.681 182096 DEBUG nova.virt.libvirt.host [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.682 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.683 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.683 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.683 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.683 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.684 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.685 182096 DEBUG nova.virt.hardware [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.687 182096 DEBUG nova.virt.libvirt.vif [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1325274361',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1325274361',id=34,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-7050n681',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:16:41Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=08d44559-7cdd-441f-9146-8146388afa84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.687 182096 DEBUG nova.network.os_vif_util [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converting VIF {"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.688 182096 DEBUG nova.network.os_vif_util [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.689 182096 DEBUG nova.objects.instance [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lazy-loading 'pci_devices' on Instance uuid 08d44559-7cdd-441f-9146-8146388afa84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.706 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <uuid>08d44559-7cdd-441f-9146-8146388afa84</uuid>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <name>instance-00000022</name>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1325274361</nova:name>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:16:44</nova:creationTime>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:user uuid="93e6ff4cf2404b8db0db1ed141716461">tempest-LiveAutoBlockMigrationV225Test-779044727-project-member</nova:user>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:project uuid="3b790db365e14138976e54a3cdfc8140">tempest-LiveAutoBlockMigrationV225Test-779044727</nova:project>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         <nova:port uuid="7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd">
Jan 23 09:16:44 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="serial">08d44559-7cdd-441f-9146-8146388afa84</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="uuid">08d44559-7cdd-441f-9146-8146388afa84</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.config"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:8b:3e:5b"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <target dev="tap7344562c-b7"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/console.log" append="off"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:16:44 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:16:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:16:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:16:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:16:44 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.707 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Preparing to wait for external event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.708 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.708 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.708 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.709 182096 DEBUG nova.virt.libvirt.vif [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1325274361',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1325274361',id=34,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-7050n681',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:16:41Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=08d44559-7cdd-441f-9146-8146388afa84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.709 182096 DEBUG nova.network.os_vif_util [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converting VIF {"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.710 182096 DEBUG nova.network.os_vif_util [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.710 182096 DEBUG os_vif [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.710 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.711 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.711 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.713 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.713 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7344562c-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.713 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7344562c-b7, col_values=(('external_ids', {'iface-id': '7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:3e:5b', 'vm-uuid': '08d44559-7cdd-441f-9146-8146388afa84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:44 compute-0 NetworkManager[54920]: <info>  [1769159804.7160] manager: (tap7344562c-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.717 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.719 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.720 182096 INFO os_vif [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7')
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.761 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.761 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.761 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] No VIF found with MAC fa:16:3e:8b:3e:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:16:44 compute-0 nova_compute[182092]: 2026-01-23 09:16:44.762 182096 INFO nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Using config drive
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.096 182096 INFO nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Creating config drive at /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.config
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.101 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgob5tnof execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.220 182096 DEBUG oslo_concurrency.processutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgob5tnof" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:45 compute-0 kernel: tap7344562c-b7: entered promiscuous mode
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.2554] manager: (tap7344562c-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00078|binding|INFO|Claiming lport 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for this chassis.
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00079|binding|INFO|7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd: Claiming fa:16:3e:8b:3e:5b 10.100.0.6
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00080|binding|INFO|Claiming lport ec0d269a-d5e0-4fd2-81cd-c44e42561b85 for this chassis.
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00081|binding|INFO|ec0d269a-d5e0-4fd2-81cd-c44e42561b85: Claiming fa:16:3e:d0:76:53 19.80.0.162
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.259 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.263 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:76:53 19.80.0.162'], port_security=['fa:16:3e:d0:76:53 19.80.0.162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-795994029', 'neutron:cidrs': '19.80.0.162/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-795994029', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=f3bf5f4f-838d-4900-9aed-b33358455586, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ec0d269a-d5e0-4fd2-81cd-c44e42561b85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.265 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:3e:5b 10.100.0.6'], port_security=['fa:16:3e:8b:3e:5b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2128270494', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '08d44559-7cdd-441f-9146-8146388afa84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2128270494', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.266 103978 INFO neutron.agent.ovn.metadata.agent [-] Port ec0d269a-d5e0-4fd2-81cd-c44e42561b85 in datapath bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 bound to our chassis
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.267 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.276 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b462dbc9-6acd-49df-8277-7e9bf6bcb41c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.277 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbddcd2e5-71 in ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.278 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbddcd2e5-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.279 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fda905-f7bc-4ab6-8125-274dd9912e96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.279 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cc958c59-0c3b-4b55-9a6a-619d1f533440]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.288 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a507e774-1fe7-4075-8c9d-a820b0ee98b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 systemd-udevd[213131]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:16:45 compute-0 systemd-machined[153562]: New machine qemu-16-instance-00000022.
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.304 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000022.
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00082|binding|INFO|Setting lport 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd ovn-installed in OVS
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00083|binding|INFO|Setting lport 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd up in Southbound
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00084|binding|INFO|Setting lport ec0d269a-d5e0-4fd2-81cd-c44e42561b85 up in Southbound
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.309 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.3141] device (tap7344562c-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.3160] device (tap7344562c-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.316 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4a36f7-3f1b-4720-902a-8f88f1ba9f5d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.336 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[40439ba2-899f-4358-bbfd-f8e8ad8a6f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.3404] manager: (tapbddcd2e5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.339 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[884789e8-30ed-46c8-b2b3-19c13cf19d3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.364 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4409eb-0739-4ff7-ab34-375be9d6c8e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.367 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc6aa95-f0a6-4c70-9111-6be42ebbeb90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.3818] device (tapbddcd2e5-70): carrier: link connected
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.384 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[25a9737a-92aa-4bc1-a6f4-d51121554d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.397 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fab35d32-6ed7-4c3e-9773-9c085845bfc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbddcd2e5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:2e:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331086, 'reachable_time': 25762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213153, 'error': None, 'target': 'ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.409 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b165e5-1d00-4f6c-ab04-9564474c4058]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:2ed3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331086, 'tstamp': 331086}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213154, 'error': None, 'target': 'ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.421 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a64263a3-624a-4801-9d87-8771b97dc3bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbddcd2e5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:2e:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331086, 'reachable_time': 25762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213155, 'error': None, 'target': 'ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.441 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5516070a-54cc-476a-96b9-1c05f24ee41d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.483 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8316a17-3bd9-4f7f-a92c-c9fc6ac342fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.484 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbddcd2e5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.484 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.485 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbddcd2e5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.486 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 NetworkManager[54920]: <info>  [1769159805.4872] manager: (tapbddcd2e5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 23 09:16:45 compute-0 kernel: tapbddcd2e5-70: entered promiscuous mode
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.489 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.490 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbddcd2e5-70, col_values=(('external_ids', {'iface-id': 'ad866155-6974-4330-8545-c4cf26f69fcd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.491 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.492 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:16:45 compute-0 ovn_controller[94697]: 2026-01-23T09:16:45Z|00085|binding|INFO|Releasing lport ad866155-6974-4330-8545-c4cf26f69fcd from this chassis (sb_readonly=0)
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.493 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8f8fdd-55d2-43f4-a826-a2edeb9af726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.493 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2.pid.haproxy
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.495 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'env', 'PROCESS_TAG=haproxy-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.505 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.677 182096 DEBUG nova.compute.manager [req-965bd7e1-7f5f-4eb9-9370-dee06e883734 req-786d9c4a-c3b2-4c84-8549-18e10038af6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.677 182096 DEBUG oslo_concurrency.lockutils [req-965bd7e1-7f5f-4eb9-9370-dee06e883734 req-786d9c4a-c3b2-4c84-8549-18e10038af6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.677 182096 DEBUG oslo_concurrency.lockutils [req-965bd7e1-7f5f-4eb9-9370-dee06e883734 req-786d9c4a-c3b2-4c84-8549-18e10038af6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.677 182096 DEBUG oslo_concurrency.lockutils [req-965bd7e1-7f5f-4eb9-9370-dee06e883734 req-786d9c4a-c3b2-4c84-8549-18e10038af6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.677 182096 DEBUG nova.compute.manager [req-965bd7e1-7f5f-4eb9-9370-dee06e883734 req-786d9c4a-c3b2-4c84-8549-18e10038af6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Processing event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:16:45 compute-0 podman[213183]: 2026-01-23 09:16:45.767811298 +0000 UTC m=+0.030377706 container create 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:16:45 compute-0 systemd[1]: Started libpod-conmon-451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc.scope.
Jan 23 09:16:45 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:16:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/876584badf6287ad538a33760330ac62eb499e6ad2701222258c953b0d793dcd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:16:45 compute-0 podman[213183]: 2026-01-23 09:16:45.817872349 +0000 UTC m=+0.080438776 container init 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:16:45 compute-0 podman[213183]: 2026-01-23 09:16:45.82224849 +0000 UTC m=+0.084814896 container start 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:16:45 compute-0 podman[213183]: 2026-01-23 09:16:45.753227347 +0000 UTC m=+0.015793775 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:16:45 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [NOTICE]   (213205) : New worker (213208) forked
Jan 23 09:16:45 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [NOTICE]   (213205) : Loading success.
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.858 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.859 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159805.8572683, 08d44559-7cdd-441f-9146-8146388afa84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.859 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] VM Started (Lifecycle Event)
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.864 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 unbound from our chassis
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.865 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.866 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.868 182096 INFO nova.virt.libvirt.driver [-] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Instance spawned successfully.
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.868 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.879 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[60ad62bd-c97a-4a0c-a94a-ebe9aac9d374]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.881 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.885 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.894 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.894 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.894 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.895 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.895 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.895 182096 DEBUG nova.virt.libvirt.driver [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.898 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.898 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159805.8573647, 08d44559-7cdd-441f-9146-8146388afa84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.898 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] VM Paused (Lifecycle Event)
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.904 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb9869d-cbe2-422a-bc3d-3875f950b193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.907 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0580fe-0714-411d-a336-244d2678181b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.917 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.920 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159805.8610268, 08d44559-7cdd-441f-9146-8146388afa84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.920 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] VM Resumed (Lifecycle Event)
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.930 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[69cf0c17-6e82-4e2e-a357-ae438d2483bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.938 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.940 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.949 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb895e70-0712-40d5-a65e-dcc15cd4b55f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329560, 'reachable_time': 30930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213218, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.952 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.961 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c64f2a-d644-44d7-9909-800f020bc8ce]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape6fc57f5-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 329568, 'tstamp': 329568}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213219, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape6fc57f5-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 329570, 'tstamp': 329570}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213219, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.962 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.963 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.964 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.964 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6fc57f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.964 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.965 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape6fc57f5-30, col_values=(('external_ids', {'iface-id': '840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:45.965 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.966 182096 INFO nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Took 4.78 seconds to spawn the instance on the hypervisor.
Jan 23 09:16:45 compute-0 nova_compute[182092]: 2026-01-23 09:16:45.966 182096 DEBUG nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.039 182096 INFO nova.compute.manager [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Took 5.23 seconds to build instance.
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.058 182096 DEBUG oslo_concurrency.lockutils [None req-54a1542e-c052-4a96-acc0-3b2fddd6590d 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.702 182096 DEBUG nova.network.neutron [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updated VIF entry in instance network info cache for port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.702 182096 DEBUG nova.network.neutron [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updating instance_info_cache with network_info: [{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.717 182096 DEBUG oslo_concurrency.lockutils [req-9c01c18d-46c6-4387-9c48-acb888b77cfd req-a49ebf88-fd22-4e1f-92b2-246a39cb7acf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:46 compute-0 nova_compute[182092]: 2026-01-23 09:16:46.895 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.749 182096 DEBUG nova.compute.manager [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.750 182096 DEBUG oslo_concurrency.lockutils [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.750 182096 DEBUG oslo_concurrency.lockutils [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.750 182096 DEBUG oslo_concurrency.lockutils [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.751 182096 DEBUG nova.compute.manager [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:47 compute-0 nova_compute[182092]: 2026-01-23 09:16:47.751 182096 WARNING nova.compute.manager [req-d215b626-f02f-46cc-a483-71b85884eaad req-6f13100f-29e2-4ff2-b294-cefae0315ee6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received unexpected event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with vm_state active and task_state None.
Jan 23 09:16:49 compute-0 nova_compute[182092]: 2026-01-23 09:16:49.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:50 compute-0 nova_compute[182092]: 2026-01-23 09:16:50.760 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Check if temp file /var/lib/nova/instances/tmp3icrgyih exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 23 09:16:50 compute-0 nova_compute[182092]: 2026-01-23 09:16:50.761 182096 DEBUG nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3icrgyih',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='08d44559-7cdd-441f-9146-8146388afa84',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.779 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.838 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.839 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.897 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:51 compute-0 nova_compute[182092]: 2026-01-23 09:16:51.897 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.676 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.676 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.677 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.736 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.805 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.806 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.868 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.873 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.937 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:52 compute-0 nova_compute[182092]: 2026-01-23 09:16:52.938 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.002 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.242 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.244 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5439MB free_disk=73.3442497253418GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.245 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.245 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.293 182096 INFO nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updating resource usage from migration 5f3f5758-6cde-449f-89ae-7388b7c40be4
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.314 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 2c49518d-6513-4224-844e-0aab2ea675e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.314 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Migration 5f3f5758-6cde-449f-89ae-7388b7c40be4 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.314 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.315 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.373 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.383 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.399 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:16:53 compute-0 nova_compute[182092]: 2026-01-23 09:16:53.399 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:53 compute-0 sshd-session[213240]: Accepted publickey for nova from 192.168.122.101 port 46580 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:16:53 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:16:53 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:16:53 compute-0 systemd-logind[746]: New session 49 of user nova.
Jan 23 09:16:53 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:16:53 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:16:53 compute-0 systemd[213244]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:16:53 compute-0 systemd[213244]: Queued start job for default target Main User Target.
Jan 23 09:16:53 compute-0 systemd[213244]: Created slice User Application Slice.
Jan 23 09:16:53 compute-0 systemd[213244]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:16:53 compute-0 systemd[213244]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:16:53 compute-0 systemd[213244]: Reached target Paths.
Jan 23 09:16:53 compute-0 systemd[213244]: Reached target Timers.
Jan 23 09:16:53 compute-0 systemd[213244]: Starting D-Bus User Message Bus Socket...
Jan 23 09:16:53 compute-0 systemd[213244]: Starting Create User's Volatile Files and Directories...
Jan 23 09:16:53 compute-0 systemd[213244]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:16:53 compute-0 systemd[213244]: Reached target Sockets.
Jan 23 09:16:53 compute-0 systemd[213244]: Finished Create User's Volatile Files and Directories.
Jan 23 09:16:53 compute-0 systemd[213244]: Reached target Basic System.
Jan 23 09:16:53 compute-0 systemd[213244]: Reached target Main User Target.
Jan 23 09:16:53 compute-0 systemd[213244]: Startup finished in 105ms.
Jan 23 09:16:53 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:16:53 compute-0 systemd[1]: Started Session 49 of User nova.
Jan 23 09:16:53 compute-0 sshd-session[213240]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:16:53 compute-0 sshd-session[213259]: Received disconnect from 192.168.122.101 port 46580:11: disconnected by user
Jan 23 09:16:53 compute-0 sshd-session[213259]: Disconnected from user nova 192.168.122.101 port 46580
Jan 23 09:16:53 compute-0 sshd-session[213240]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:16:53 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 09:16:53 compute-0 systemd-logind[746]: Session 49 logged out. Waiting for processes to exit.
Jan 23 09:16:53 compute-0 systemd-logind[746]: Removed session 49.
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.650 182096 DEBUG nova.compute.manager [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.652 182096 DEBUG oslo_concurrency.lockutils [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.652 182096 DEBUG oslo_concurrency.lockutils [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.652 182096 DEBUG oslo_concurrency.lockutils [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.652 182096 DEBUG nova.compute.manager [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.652 182096 DEBUG nova.compute.manager [req-489ba25a-e8bd-460d-b42c-8d23e846e133 req-558b0a7c-a294-45a1-9d9f-08ffe666333e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:16:54 compute-0 nova_compute[182092]: 2026-01-23 09:16:54.715 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.043 182096 INFO nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Took 3.14 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.044 182096 DEBUG nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.053 182096 DEBUG nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3icrgyih',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='08d44559-7cdd-441f-9146-8146388afa84',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(5f3f5758-6cde-449f-89ae-7388b7c40be4),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.065 182096 DEBUG nova.objects.instance [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lazy-loading 'migration_context' on Instance uuid 08d44559-7cdd-441f-9146-8146388afa84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.066 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.067 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.067 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.079 182096 DEBUG nova.virt.libvirt.vif [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1325274361',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1325274361',id=34,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-7050n681',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:16:46Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=08d44559-7cdd-441f-9146-8146388afa84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.079 182096 DEBUG nova.network.os_vif_util [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converting VIF {"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.080 182096 DEBUG nova.network.os_vif_util [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.080 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 09:16:55 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:8b:3e:5b"/>
Jan 23 09:16:55 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:16:55 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:16:55 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:16:55 compute-0 nova_compute[182092]:   <target dev="tap7344562c-b7"/>
Jan 23 09:16:55 compute-0 nova_compute[182092]: </interface>
Jan 23 09:16:55 compute-0 nova_compute[182092]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.081 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.395 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.395 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.395 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.396 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.534 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.535 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.535 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.535 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.569 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.569 182096 INFO nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 23 09:16:55 compute-0 nova_compute[182092]: 2026-01-23 09:16:55.635 182096 INFO nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.137 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.139 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 23 09:16:56 compute-0 podman[213284]: 2026-01-23 09:16:56.247048157 +0000 UTC m=+0.079452806 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:16:56 compute-0 podman[213283]: 2026-01-23 09:16:56.250727272 +0000 UTC m=+0.083341806 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.643 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.644 182096 DEBUG nova.virt.libvirt.migration [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.735 182096 DEBUG nova.compute.manager [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.736 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.736 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.737 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.737 182096 DEBUG nova.compute.manager [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.737 182096 WARNING nova.compute.manager [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received unexpected event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with vm_state active and task_state migrating.
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.738 182096 DEBUG nova.compute.manager [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-changed-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.738 182096 DEBUG nova.compute.manager [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Refreshing instance network info cache due to event network-changed-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.739 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.739 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.739 182096 DEBUG nova.network.neutron [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Refreshing network info cache for port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.898 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.922 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159816.9227076, 08d44559-7cdd-441f-9146-8146388afa84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.923 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] VM Paused (Lifecycle Event)
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.941 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.945 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.958 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [{"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.977 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.982 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-2c49518d-6513-4224-844e-0aab2ea675e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.982 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.983 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.983 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.983 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:56 compute-0 nova_compute[182092]: 2026-01-23 09:16:56.984 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:16:57 compute-0 kernel: tap7344562c-b7 (unregistering): left promiscuous mode
Jan 23 09:16:57 compute-0 NetworkManager[54920]: <info>  [1769159817.0484] device (tap7344562c-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.054 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00086|binding|INFO|Releasing lport 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd from this chassis (sb_readonly=0)
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00087|binding|INFO|Setting lport 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd down in Southbound
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00088|binding|INFO|Releasing lport ec0d269a-d5e0-4fd2-81cd-c44e42561b85 from this chassis (sb_readonly=0)
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00089|binding|INFO|Setting lport ec0d269a-d5e0-4fd2-81cd-c44e42561b85 down in Southbound
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00090|binding|INFO|Removing iface tap7344562c-b7 ovn-installed in OVS
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.055 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.058 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:76:53 19.80.0.162'], port_security=['fa:16:3e:d0:76:53 19.80.0.162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-795994029', 'neutron:cidrs': '19.80.0.162/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-795994029', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '3', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=f3bf5f4f-838d-4900-9aed-b33358455586, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ec0d269a-d5e0-4fd2-81cd-c44e42561b85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00091|binding|INFO|Releasing lport 840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0 from this chassis (sb_readonly=0)
Jan 23 09:16:57 compute-0 ovn_controller[94697]: 2026-01-23T09:16:57Z|00092|binding|INFO|Releasing lport ad866155-6974-4330-8545-c4cf26f69fcd from this chassis (sb_readonly=0)
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.059 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:3e:5b 10.100.0.6'], port_security=['fa:16:3e:8b:3e:5b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1166248c-76b2-44bd-b867-c12c2a2e3d39'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2128270494', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '08d44559-7cdd-441f-9146-8146388afa84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2128270494', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.060 103978 INFO neutron.agent.ovn.metadata.agent [-] Port ec0d269a-d5e0-4fd2-81cd-c44e42561b85 in datapath bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 unbound from our chassis
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.061 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.062 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f347519a-9e65-4b51-806e-d897dc2840d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.062 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 namespace which is not needed anymore
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.078 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.101 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 23 09:16:57 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Consumed 11.814s CPU time.
Jan 23 09:16:57 compute-0 systemd-machined[153562]: Machine qemu-16-instance-00000022 terminated.
Jan 23 09:16:57 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [NOTICE]   (213205) : haproxy version is 2.8.14-c23fe91
Jan 23 09:16:57 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [NOTICE]   (213205) : path to executable is /usr/sbin/haproxy
Jan 23 09:16:57 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [ALERT]    (213205) : Current worker (213208) exited with code 143 (Terminated)
Jan 23 09:16:57 compute-0 neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2[213198]: [WARNING]  (213205) : All workers exited. Exiting... (0)
Jan 23 09:16:57 compute-0 systemd[1]: libpod-451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc.scope: Deactivated successfully.
Jan 23 09:16:57 compute-0 podman[213341]: 2026-01-23 09:16:57.166856705 +0000 UTC m=+0.031678301 container died 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 09:16:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc-userdata-shm.mount: Deactivated successfully.
Jan 23 09:16:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-876584badf6287ad538a33760330ac62eb499e6ad2701222258c953b0d793dcd-merged.mount: Deactivated successfully.
Jan 23 09:16:57 compute-0 podman[213341]: 2026-01-23 09:16:57.186708348 +0000 UTC m=+0.051529945 container cleanup 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:16:57 compute-0 systemd[1]: libpod-conmon-451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc.scope: Deactivated successfully.
Jan 23 09:16:57 compute-0 podman[213365]: 2026-01-23 09:16:57.225001315 +0000 UTC m=+0.022765116 container remove 451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.228 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[092c4bff-12d0-4b98-82aa-f67794fc6b0c]: (4, ('Fri Jan 23 09:16:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 (451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc)\n451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc\nFri Jan 23 09:16:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 (451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc)\n451d8bf565f2846316b9dc9e50e183a5f4b64100c85790129f402886617f3afc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.229 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8f89aa-b92d-464e-aa19-1e3a4eb1a2d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.230 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbddcd2e5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 kernel: tapbddcd2e5-70: left promiscuous mode
Jan 23 09:16:57 compute-0 NetworkManager[54920]: <info>  [1769159817.2486] manager: (tap7344562c-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.248 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.250 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a934405c-9e59-499c-8ac6-dbef07e4809e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.265 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[89647815-319a-43d6-b6d5-07133c084697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.266 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd4f7d3-359d-4da4-9989-5bd6d9a92ffc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.276 182096 DEBUG nova.virt.libvirt.guest [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.276 182096 INFO nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migration operation has completed
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.277 182096 INFO nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] _post_live_migration() is started..
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.278 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.278 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[779ecd29-992e-4eec-b34a-46c78e45e826]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331081, 'reachable_time': 26849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213390, 'error': None, 'target': 'ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.279 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.279 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 23 09:16:57 compute-0 systemd[1]: run-netns-ovnmeta\x2dbddcd2e5\x2d7d9d\x2d41ac\x2dbf50\x2ddfb6dd7af3b2.mount: Deactivated successfully.
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.280 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bddcd2e5-7d9d-41ac-bf50-dfb6dd7af3b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.280 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[10b18cfc-e68c-4520-8c73-947eaf39efd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.281 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 unbound from our chassis
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.282 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e6fc57f5-3894-49ed-8321-7285a3da09a0
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.292 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d49ea9a3-f391-401e-baf1-4257aced17eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.309 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e3fa6a-65c9-4d5a-8455-19f545f6151a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.311 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[76e5d6b4-6be7-4dc3-8a23-c8c120e5d14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.327 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[22a28762-fc22-420c-ab19-c0f6b921cb40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.338 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c473561-9d6d-414e-9bfb-88bfbdf7d7cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape6fc57f5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:e6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329560, 'reachable_time': 30930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213400, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.346 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d817c8-51ab-4209-bf62-d7cd7439489c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape6fc57f5-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 329568, 'tstamp': 329568}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213401, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape6fc57f5-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 329570, 'tstamp': 329570}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213401, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.348 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.349 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 nova_compute[182092]: 2026-01-23 09:16:57.352 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.352 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6fc57f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.352 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.353 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape6fc57f5-30, col_values=(('external_ids', {'iface-id': '840fa4b8-c4fe-41fd-bd18-1e22e94a7dc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:16:57.353 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.820 182096 DEBUG nova.compute.manager [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.820 182096 DEBUG oslo_concurrency.lockutils [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.821 182096 DEBUG oslo_concurrency.lockutils [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.821 182096 DEBUG oslo_concurrency.lockutils [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.821 182096 DEBUG nova.compute.manager [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.821 182096 DEBUG nova.compute.manager [req-bebd4822-9b89-42d2-92b4-fc8443dd2778 req-6c69f131-5af3-4e1e-a222-6b6d68bc17a7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-unplugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.898 182096 DEBUG nova.network.neutron [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Activated binding for port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.899 182096 DEBUG nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.899 182096 DEBUG nova.virt.libvirt.vif [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:16:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1325274361',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1325274361',id=34,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-7050n681',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:16:50Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=08d44559-7cdd-441f-9146-8146388afa84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.900 182096 DEBUG nova.network.os_vif_util [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converting VIF {"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.900 182096 DEBUG nova.network.os_vif_util [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.900 182096 DEBUG os_vif [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.902 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.902 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7344562c-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.903 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.907 182096 INFO os_vif [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:3e:5b,bridge_name='br-int',has_traffic_filtering=True,id=7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7344562c-b7')
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.908 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.908 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.908 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.908 182096 DEBUG nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.908 182096 INFO nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Deleting instance files /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84_del
Jan 23 09:16:58 compute-0 nova_compute[182092]: 2026-01-23 09:16:58.909 182096 INFO nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Deletion of /var/lib/nova/instances/08d44559-7cdd-441f-9146-8146388afa84_del complete
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.165 182096 DEBUG nova.network.neutron [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updated VIF entry in instance network info cache for port 7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.165 182096 DEBUG nova.network.neutron [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Updating instance_info_cache with network_info: [{"id": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "address": "fa:16:3e:8b:3e:5b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7344562c-b7", "ovs_interfaceid": "7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.186 182096 DEBUG oslo_concurrency.lockutils [req-a96db924-aab2-44ec-825a-450b1f1249a3 req-2ef95ef4-6461-4124-9f10-323b8683d41a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-08d44559-7cdd-441f-9146-8146388afa84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.851 182096 DEBUG nova.compute.manager [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.851 182096 DEBUG oslo_concurrency.lockutils [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.852 182096 DEBUG oslo_concurrency.lockutils [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.852 182096 DEBUG oslo_concurrency.lockutils [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.852 182096 DEBUG nova.compute.manager [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:16:59 compute-0 nova_compute[182092]: 2026-01-23 09:16:59.852 182096 WARNING nova.compute.manager [req-6374bc4a-3f8b-46f8-b1ea-9b5b15b517ca req-f8352dd7-604c-4da4-a323-be4cd14762c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received unexpected event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with vm_state active and task_state migrating.
Jan 23 09:17:00 compute-0 podman[213404]: 2026-01-23 09:17:00.203360124 +0000 UTC m=+0.042389931 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.950 182096 DEBUG nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.950 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.950 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.950 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 WARNING nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received unexpected event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with vm_state active and task_state migrating.
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG oslo_concurrency.lockutils [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 DEBUG nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] No waiting events found dispatching network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:01 compute-0 nova_compute[182092]: 2026-01-23 09:17:01.951 182096 WARNING nova.compute.manager [req-ad53c4dd-525c-4631-a179-9f91f801d507 req-90c751b0-27da-4b75-bc14-9b4f98c1ce3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Received unexpected event network-vif-plugged-7344562c-b7a8-4b23-8ae1-d2fb17c0b5fd for instance with vm_state active and task_state migrating.
Jan 23 09:17:03 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:17:03 compute-0 systemd[213244]: Activating special unit Exit the Session...
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped target Main User Target.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped target Basic System.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped target Paths.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped target Sockets.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped target Timers.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:17:03 compute-0 systemd[213244]: Closed D-Bus User Message Bus Socket.
Jan 23 09:17:03 compute-0 systemd[213244]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:17:03 compute-0 systemd[213244]: Removed slice User Application Slice.
Jan 23 09:17:03 compute-0 systemd[213244]: Reached target Shutdown.
Jan 23 09:17:03 compute-0 systemd[213244]: Finished Exit the Session.
Jan 23 09:17:03 compute-0 systemd[213244]: Reached target Exit the Session.
Jan 23 09:17:03 compute-0 nova_compute[182092]: 2026-01-23 09:17:03.904 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:03 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:17:03 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:17:03 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:17:03 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:17:03 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:17:03 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:17:03 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.480 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "08d44559-7cdd-441f-9146-8146388afa84-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.481 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.481 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "08d44559-7cdd-441f-9146-8146388afa84-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.503 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.503 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.504 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.504 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.554 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.611 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.612 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.658 182096 DEBUG oslo_concurrency.processutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.863 182096 WARNING nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.866 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5527MB free_disk=73.3452033996582GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.866 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.867 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.924 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Migration for instance 08d44559-7cdd-441f-9146-8146388afa84 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.942 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.967 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Instance 2c49518d-6513-4224-844e-0aab2ea675e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.968 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Migration 5f3f5758-6cde-449f-89ae-7388b7c40be4 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.968 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:17:04 compute-0 nova_compute[182092]: 2026-01-23 09:17:04.968 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.013 182096 DEBUG nova.compute.provider_tree [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.029 182096 DEBUG nova.scheduler.client.report [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.047 182096 DEBUG nova.compute.resource_tracker [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.047 182096 DEBUG oslo_concurrency.lockutils [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.057 182096 INFO nova.compute.manager [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.142 182096 INFO nova.scheduler.client.report [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] Deleted allocation for migration 5f3f5758-6cde-449f-89ae-7388b7c40be4
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.143 182096 DEBUG nova.virt.libvirt.driver [None req-c3443dac-660a-4b5a-96cd-ca929648b505 fafdd8c5d7e045168751bef2bc933cbb 6e7f568f2a1d464a84ccf4cbe7109ce8 - - default default] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.968 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.968 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:05 compute-0 nova_compute[182092]: 2026-01-23 09:17:05.980 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.078 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.078 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.083 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.084 182096 INFO nova.compute.claims [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.178 182096 DEBUG nova.compute.provider_tree [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.199 182096 DEBUG nova.scheduler.client.report [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.228 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.229 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.290 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.291 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.300 182096 INFO nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.315 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.414 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.415 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.416 182096 INFO nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Creating image(s)
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.416 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.417 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.417 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.427 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.485 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.486 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.487 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.496 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.543 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.544 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.567 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.568 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.568 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.615 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.616 182096 DEBUG nova.virt.disk.api [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Checking if we can resize image /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.616 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.662 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.663 182096 DEBUG nova.virt.disk.api [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Cannot resize image /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.664 182096 DEBUG nova.objects.instance [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lazy-loading 'migration_context' on Instance uuid f1821388-9fca-4290-87d4-9dd6bcf8a43c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.765 182096 DEBUG nova.policy [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '59ffc99eb9794e72b6a87f9d75fce29d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70ddb54a12cf4d1985e6acd166753f21', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.778 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.778 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Ensure instance console log exists: /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.779 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.779 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.779 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:06 compute-0 nova_compute[182092]: 2026-01-23 09:17:06.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:07 compute-0 nova_compute[182092]: 2026-01-23 09:17:07.446 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Successfully created port: 206e535c-3563-4417-9605-fd69dab3e3e8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.091 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Successfully updated port: 206e535c-3563-4417-9605-fd69dab3e3e8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.105 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.105 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquired lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.105 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.258 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.516 182096 DEBUG nova.compute.manager [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-changed-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.517 182096 DEBUG nova.compute.manager [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Refreshing instance network info cache due to event network-changed-206e535c-3563-4417-9605-fd69dab3e3e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.517 182096 DEBUG oslo_concurrency.lockutils [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:17:08 compute-0 nova_compute[182092]: 2026-01-23 09:17:08.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:08 compute-0 podman[213445]: 2026-01-23 09:17:08.976573684 +0000 UTC m=+0.059875167 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.159 182096 DEBUG nova.network.neutron [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updating instance_info_cache with network_info: [{"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.171 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Releasing lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.171 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Instance network_info: |[{"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.171 182096 DEBUG oslo_concurrency.lockutils [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.172 182096 DEBUG nova.network.neutron [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Refreshing network info cache for port 206e535c-3563-4417-9605-fd69dab3e3e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.174 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Start _get_guest_xml network_info=[{"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.178 182096 WARNING nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.184 182096 DEBUG nova.virt.libvirt.host [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.184 182096 DEBUG nova.virt.libvirt.host [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.187 182096 DEBUG nova.virt.libvirt.host [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.187 182096 DEBUG nova.virt.libvirt.host [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.188 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.188 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.189 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.190 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.190 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.190 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.190 182096 DEBUG nova.virt.hardware [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.193 182096 DEBUG nova.virt.libvirt.vif [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:17:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-5851706',display_name='tempest-ServersTestJSON-server-5851706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-5851706',id=37,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIxx4zHnKAmeEKCZnsx+0+xWt6VmmMvNp5BlPZqzrMqbNnRLMUfgTclMkdYj4pdqpdfo7DP5+3XLcqI3TwrIZwGyqbMdO7pU7ys2L/v/E6T5JeBKyiRbSCSz6rTdC31xFA==',key_name='tempest-keypair-195960502',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70ddb54a12cf4d1985e6acd166753f21',ramdisk_id='',reservation_id='r-hlgux4hw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-104714381',owner_user_name='tempest-ServersTestJSON-104714381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:17:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='59ffc99eb9794e72b6a87f9d75fce29d',uuid=f1821388-9fca-4290-87d4-9dd6bcf8a43c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.193 182096 DEBUG nova.network.os_vif_util [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converting VIF {"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.194 182096 DEBUG nova.network.os_vif_util [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.195 182096 DEBUG nova.objects.instance [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1821388-9fca-4290-87d4-9dd6bcf8a43c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.202 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <uuid>f1821388-9fca-4290-87d4-9dd6bcf8a43c</uuid>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <name>instance-00000025</name>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersTestJSON-server-5851706</nova:name>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:17:09</nova:creationTime>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:user uuid="59ffc99eb9794e72b6a87f9d75fce29d">tempest-ServersTestJSON-104714381-project-member</nova:user>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:project uuid="70ddb54a12cf4d1985e6acd166753f21">tempest-ServersTestJSON-104714381</nova:project>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         <nova:port uuid="206e535c-3563-4417-9605-fd69dab3e3e8">
Jan 23 09:17:09 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <system>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="serial">f1821388-9fca-4290-87d4-9dd6bcf8a43c</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="uuid">f1821388-9fca-4290-87d4-9dd6bcf8a43c</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </system>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <os>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </os>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <features>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </features>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.config"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:07:0d:16"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <target dev="tap206e535c-35"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/console.log" append="off"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <video>
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </video>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:17:09 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:17:09 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:17:09 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:17:09 compute-0 nova_compute[182092]: </domain>
Jan 23 09:17:09 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.203 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Preparing to wait for external event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.203 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.204 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.204 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.204 182096 DEBUG nova.virt.libvirt.vif [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:17:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-5851706',display_name='tempest-ServersTestJSON-server-5851706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-5851706',id=37,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIxx4zHnKAmeEKCZnsx+0+xWt6VmmMvNp5BlPZqzrMqbNnRLMUfgTclMkdYj4pdqpdfo7DP5+3XLcqI3TwrIZwGyqbMdO7pU7ys2L/v/E6T5JeBKyiRbSCSz6rTdC31xFA==',key_name='tempest-keypair-195960502',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70ddb54a12cf4d1985e6acd166753f21',ramdisk_id='',reservation_id='r-hlgux4hw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-104714381',owner_user_name='tempest-ServersTestJSON-104714381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:17:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='59ffc99eb9794e72b6a87f9d75fce29d',uuid=f1821388-9fca-4290-87d4-9dd6bcf8a43c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.205 182096 DEBUG nova.network.os_vif_util [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converting VIF {"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.205 182096 DEBUG nova.network.os_vif_util [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.205 182096 DEBUG os_vif [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.206 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.206 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.206 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.209 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap206e535c-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.209 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap206e535c-35, col_values=(('external_ids', {'iface-id': '206e535c-3563-4417-9605-fd69dab3e3e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:0d:16', 'vm-uuid': 'f1821388-9fca-4290-87d4-9dd6bcf8a43c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:09 compute-0 NetworkManager[54920]: <info>  [1769159829.2110] manager: (tap206e535c-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.214 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.216 182096 INFO os_vif [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35')
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.253 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.253 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.253 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] No VIF found with MAC fa:16:3e:07:0d:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.254 182096 INFO nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Using config drive
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.705 182096 INFO nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Creating config drive at /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.config
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.710 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ge6462y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.829 182096 DEBUG oslo_concurrency.processutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ge6462y" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:09 compute-0 kernel: tap206e535c-35: entered promiscuous mode
Jan 23 09:17:09 compute-0 NetworkManager[54920]: <info>  [1769159829.8686] manager: (tap206e535c-35): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 ovn_controller[94697]: 2026-01-23T09:17:09Z|00093|binding|INFO|Claiming lport 206e535c-3563-4417-9605-fd69dab3e3e8 for this chassis.
Jan 23 09:17:09 compute-0 ovn_controller[94697]: 2026-01-23T09:17:09Z|00094|binding|INFO|206e535c-3563-4417-9605-fd69dab3e3e8: Claiming fa:16:3e:07:0d:16 10.100.0.4
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.874 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.881 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:0d:16 10.100.0.4'], port_security=['fa:16:3e:07:0d:16 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f1821388-9fca-4290-87d4-9dd6bcf8a43c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ddb54a12cf4d1985e6acd166753f21', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fd9c031f-2dd7-47fe-8fec-005527785ac2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7ebffb0-186d-47d8-996c-b51a6ab2cd05, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=206e535c-3563-4417-9605-fd69dab3e3e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.881 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 206e535c-3563-4417-9605-fd69dab3e3e8 in datapath b3483cba-74b1-4ca2-b026-c9f53cccfe25 bound to our chassis
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.883 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3483cba-74b1-4ca2-b026-c9f53cccfe25
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.891 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d75301-20ca-453c-8f13-b2c2bd64b213]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.891 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3483cba-71 in ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.893 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3483cba-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.893 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20a965bf-60d1-4ad1-ab1a-1907ccea4129]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.893 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1285aa-7dc8-49b2-bdb7-13314bb52e93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 systemd-machined[153562]: New machine qemu-17-instance-00000025.
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.903 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[39560bd2-7ff1-4a9d-adcc-3d133de46c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000025.
Jan 23 09:17:09 compute-0 systemd-udevd[213490]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:17:09 compute-0 NetworkManager[54920]: <info>  [1769159829.9253] device (tap206e535c-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:17:09 compute-0 NetworkManager[54920]: <info>  [1769159829.9261] device (tap206e535c-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:17:09 compute-0 ovn_controller[94697]: 2026-01-23T09:17:09Z|00095|binding|INFO|Setting lport 206e535c-3563-4417-9605-fd69dab3e3e8 ovn-installed in OVS
Jan 23 09:17:09 compute-0 ovn_controller[94697]: 2026-01-23T09:17:09Z|00096|binding|INFO|Setting lport 206e535c-3563-4417-9605-fd69dab3e3e8 up in Southbound
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.934 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[02adaf2c-4687-4ce7-a50a-5a20c3904d5f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 nova_compute[182092]: 2026-01-23 09:17:09.935 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.957 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e087ad2a-5831-4c7e-ad37-9997cfe9241d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 NetworkManager[54920]: <info>  [1769159829.9628] manager: (tapb3483cba-70): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Jan 23 09:17:09 compute-0 systemd-udevd[213493]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.962 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c601cd-d7f9-427f-9f32-19ddb1e740f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.985 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1a115a25-8079-4fef-95d9-956207d7db02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:09.989 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd552d0-7bc2-47c8-bd6b-ef86ba468bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 NetworkManager[54920]: <info>  [1769159830.0054] device (tapb3483cba-70): carrier: link connected
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.008 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[cddb213c-8e56-4dd4-a5de-d8f748f1de99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.020 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd5313b-34ca-42ae-8ad8-f037d0a3c8b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3483cba-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:ff:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333549, 'reachable_time': 34689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213512, 'error': None, 'target': 'ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.031 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f1736fd3-4a95-4be2-aa07-c7750665b81d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:ff71'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333549, 'tstamp': 333549}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213513, 'error': None, 'target': 'ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.042 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3a62c2-2407-440b-a32c-576750a8797a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3483cba-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:ff:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333549, 'reachable_time': 34689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213514, 'error': None, 'target': 'ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.061 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7b79af20-fc1d-471c-a374-665352998ad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.099 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d85fd618-6aca-4407-a496-4cc1149887ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.100 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3483cba-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.100 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.101 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3483cba-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:10 compute-0 NetworkManager[54920]: <info>  [1769159830.1027] manager: (tapb3483cba-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.102 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:10 compute-0 kernel: tapb3483cba-70: entered promiscuous mode
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.105 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3483cba-70, col_values=(('external_ids', {'iface-id': 'f96f92b7-7bab-4a4d-9a20-b726b45d7b26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:10 compute-0 ovn_controller[94697]: 2026-01-23T09:17:10Z|00097|binding|INFO|Releasing lport f96f92b7-7bab-4a4d-9a20-b726b45d7b26 from this chassis (sb_readonly=0)
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.106 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.119 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.119 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3483cba-74b1-4ca2-b026-c9f53cccfe25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3483cba-74b1-4ca2-b026-c9f53cccfe25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.120 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7707a8f7-fc24-4883-99a1-bb4b2ab09f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.121 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-b3483cba-74b1-4ca2-b026-c9f53cccfe25
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/b3483cba-74b1-4ca2-b026-c9f53cccfe25.pid.haproxy
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID b3483cba-74b1-4ca2-b026-c9f53cccfe25
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:17:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:10.122 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'env', 'PROCESS_TAG=haproxy-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3483cba-74b1-4ca2-b026-c9f53cccfe25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.232 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159830.2315059, f1821388-9fca-4290-87d4-9dd6bcf8a43c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.232 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] VM Started (Lifecycle Event)
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.237 182096 DEBUG nova.network.neutron [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updated VIF entry in instance network info cache for port 206e535c-3563-4417-9605-fd69dab3e3e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.238 182096 DEBUG nova.network.neutron [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updating instance_info_cache with network_info: [{"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.255 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.256 182096 DEBUG oslo_concurrency.lockutils [req-8d88744a-7b14-4cbd-86af-2ddd447832b1 req-59d02fc6-aeed-43ec-a7e1-3a1045302b31 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.258 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159830.2341032, f1821388-9fca-4290-87d4-9dd6bcf8a43c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.258 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] VM Paused (Lifecycle Event)
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.269 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.271 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:17:10 compute-0 nova_compute[182092]: 2026-01-23 09:17:10.285 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:17:10 compute-0 podman[213549]: 2026-01-23 09:17:10.407263872 +0000 UTC m=+0.030141669 container create e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:17:10 compute-0 systemd[1]: Started libpod-conmon-e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b.scope.
Jan 23 09:17:10 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:17:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07d105a212fa154ba899b5255a24cea46943fb93b0eabb02c907132cd69b688a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:17:10 compute-0 podman[213549]: 2026-01-23 09:17:10.458831126 +0000 UTC m=+0.081708945 container init e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:17:10 compute-0 podman[213549]: 2026-01-23 09:17:10.463201375 +0000 UTC m=+0.086079173 container start e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:17:10 compute-0 podman[213549]: 2026-01-23 09:17:10.393456979 +0000 UTC m=+0.016334797 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:17:10 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [NOTICE]   (213566) : New worker (213568) forked
Jan 23 09:17:10 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [NOTICE]   (213566) : Loading success.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.424 182096 DEBUG nova.compute.manager [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.425 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.425 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.425 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.425 182096 DEBUG nova.compute.manager [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Processing event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.426 182096 DEBUG nova.compute.manager [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.426 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.426 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.426 182096 DEBUG oslo_concurrency.lockutils [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.427 182096 DEBUG nova.compute.manager [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] No waiting events found dispatching network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.427 182096 WARNING nova.compute.manager [req-2df013d7-fbf6-4508-bd79-45214b815bbd req-212b740d-02f7-4107-be79-0c57d4080bf3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received unexpected event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 for instance with vm_state building and task_state spawning.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.427 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.431 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159831.4307818, f1821388-9fca-4290-87d4-9dd6bcf8a43c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.431 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] VM Resumed (Lifecycle Event)
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.433 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.435 182096 INFO nova.virt.libvirt.driver [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Instance spawned successfully.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.435 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.451 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.454 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.455 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.455 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.455 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.456 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.456 182096 DEBUG nova.virt.libvirt.driver [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.459 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.485 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.510 182096 INFO nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Took 5.10 seconds to spawn the instance on the hypervisor.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.511 182096 DEBUG nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.570 182096 INFO nova.compute.manager [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Took 5.53 seconds to build instance.
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.588 182096 DEBUG oslo_concurrency.lockutils [None req-fe87d2f7-4271-4b9e-9add-e57b19bc6969 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.902 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.983 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.983 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.983 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.984 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.984 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.991 182096 INFO nova.compute.manager [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Terminating instance
Jan 23 09:17:11 compute-0 nova_compute[182092]: 2026-01-23 09:17:11.996 182096 DEBUG nova.compute.manager [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:17:12 compute-0 kernel: tap1eeea937-6c (unregistering): left promiscuous mode
Jan 23 09:17:12 compute-0 NetworkManager[54920]: <info>  [1769159832.0238] device (tap1eeea937-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:17:12 compute-0 ovn_controller[94697]: 2026-01-23T09:17:12Z|00098|binding|INFO|Releasing lport 1eeea937-6c85-4302-be8a-532452fb9f66 from this chassis (sb_readonly=0)
Jan 23 09:17:12 compute-0 ovn_controller[94697]: 2026-01-23T09:17:12Z|00099|binding|INFO|Setting lport 1eeea937-6c85-4302-be8a-532452fb9f66 down in Southbound
Jan 23 09:17:12 compute-0 ovn_controller[94697]: 2026-01-23T09:17:12Z|00100|binding|INFO|Removing iface tap1eeea937-6c ovn-installed in OVS
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.035 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.037 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.041 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:7b:0b 10.100.0.12'], port_security=['fa:16:3e:15:7b:0b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c49518d-6513-4224-844e-0aab2ea675e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b790db365e14138976e54a3cdfc8140', 'neutron:revision_number': '20', 'neutron:security_group_ids': '4258f465-de11-4fd7-a213-c5cfeb4a3d2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80efab28-ca7e-4b92-9afd-cf93bba23f1d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1eeea937-6c85-4302-be8a-532452fb9f66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.042 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1eeea937-6c85-4302-be8a-532452fb9f66 in datapath e6fc57f5-3894-49ed-8321-7285a3da09a0 unbound from our chassis
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.044 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6fc57f5-3894-49ed-8321-7285a3da09a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.047 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf31f40-8a9c-4913-a209-e0bc09783a13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.048 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 namespace which is not needed anymore
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.053 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 23 09:17:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000020.scope: Consumed 3.858s CPU time.
Jan 23 09:17:12 compute-0 systemd-machined[153562]: Machine qemu-15-instance-00000020 terminated.
Jan 23 09:17:12 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [NOTICE]   (213024) : haproxy version is 2.8.14-c23fe91
Jan 23 09:17:12 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [NOTICE]   (213024) : path to executable is /usr/sbin/haproxy
Jan 23 09:17:12 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [ALERT]    (213024) : Current worker (213026) exited with code 143 (Terminated)
Jan 23 09:17:12 compute-0 neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0[213020]: [WARNING]  (213024) : All workers exited. Exiting... (0)
Jan 23 09:17:12 compute-0 systemd[1]: libpod-4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583.scope: Deactivated successfully.
Jan 23 09:17:12 compute-0 podman[213591]: 2026-01-23 09:17:12.156753232 +0000 UTC m=+0.038830752 container died 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:17:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583-userdata-shm.mount: Deactivated successfully.
Jan 23 09:17:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-738e74df8bbe9b9e44e1a6650d545f8574366dedf35b0e0322fd70e418021ce8-merged.mount: Deactivated successfully.
Jan 23 09:17:12 compute-0 podman[213591]: 2026-01-23 09:17:12.179969187 +0000 UTC m=+0.062046698 container cleanup 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 09:17:12 compute-0 systemd[1]: libpod-conmon-4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583.scope: Deactivated successfully.
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.212 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.242 182096 INFO nova.virt.libvirt.driver [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Instance destroyed successfully.
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.242 182096 DEBUG nova.objects.instance [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lazy-loading 'resources' on Instance uuid 2c49518d-6513-4224-844e-0aab2ea675e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:12 compute-0 podman[213615]: 2026-01-23 09:17:12.249288853 +0000 UTC m=+0.053459906 container remove 4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.253 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa7ded8-7725-40f5-bac6-d00940c8c47f]: (4, ('Fri Jan 23 09:17:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 (4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583)\n4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583\nFri Jan 23 09:17:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 (4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583)\n4bf9b407b9dcf62cd2e32dcce5720c582e4be96b5b8f87eab62cf5bd1e5cd583\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.254 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c84e811-710e-4327-b0ff-ad5f46211704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.255 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6fc57f5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:12 compute-0 kernel: tape6fc57f5-30: left promiscuous mode
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.256 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.260 182096 DEBUG nova.virt.libvirt.vif [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:15:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-142208586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-142208586',id=32,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b790db365e14138976e54a3cdfc8140',ramdisk_id='',reservation_id='r-ouizy327',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-779044727',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-779044727-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:16:33Z,user_data=None,user_id='93e6ff4cf2404b8db0db1ed141716461',uuid=2c49518d-6513-4224-844e-0aab2ea675e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.260 182096 DEBUG nova.network.os_vif_util [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converting VIF {"id": "1eeea937-6c85-4302-be8a-532452fb9f66", "address": "fa:16:3e:15:7b:0b", "network": {"id": "e6fc57f5-3894-49ed-8321-7285a3da09a0", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1230031000-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b790db365e14138976e54a3cdfc8140", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1eeea937-6c", "ovs_interfaceid": "1eeea937-6c85-4302-be8a-532452fb9f66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.261 182096 DEBUG nova.network.os_vif_util [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.261 182096 DEBUG os_vif [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.263 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.263 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1eeea937-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.264 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.266 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.271 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.272 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.273 182096 INFO os_vif [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:7b:0b,bridge_name='br-int',has_traffic_filtering=True,id=1eeea937-6c85-4302-be8a-532452fb9f66,network=Network(e6fc57f5-3894-49ed-8321-7285a3da09a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1eeea937-6c')
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.274 182096 INFO nova.virt.libvirt.driver [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Deleting instance files /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7_del
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.274 182096 INFO nova.virt.libvirt.driver [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Deletion of /var/lib/nova/instances/2c49518d-6513-4224-844e-0aab2ea675e7_del complete
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.276 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159817.2752671, 08d44559-7cdd-441f-9146-8146388afa84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.276 182096 INFO nova.compute.manager [-] [instance: 08d44559-7cdd-441f-9146-8146388afa84] VM Stopped (Lifecycle Event)
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.274 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a1e21d-c9a3-4624-a656-16f448bcf29a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.284 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[df08df33-3494-41e4-9bfc-1b18a6456d25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.285 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[80432eeb-fa18-4a49-aa19-f386238a3f88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.299 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[71920787-01d8-4055-a935-59f86a3abae9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329555, 'reachable_time': 19634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213642, 'error': None, 'target': 'ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.301 182096 DEBUG nova.compute.manager [None req-c4db8b96-b019-4c3f-9cbb-4185fd476f5a - - - - - -] [instance: 08d44559-7cdd-441f-9146-8146388afa84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:12 compute-0 systemd[1]: run-netns-ovnmeta\x2de6fc57f5\x2d3894\x2d49ed\x2d8321\x2d7285a3da09a0.mount: Deactivated successfully.
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.304 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e6fc57f5-3894-49ed-8321-7285a3da09a0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:17:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:12.304 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[147ef673-8205-4d69-9a84-39269aad6de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.329 182096 INFO nova.compute.manager [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.330 182096 DEBUG oslo.service.loopingcall [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.330 182096 DEBUG nova.compute.manager [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.330 182096 DEBUG nova.network.neutron [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.890 182096 DEBUG nova.network.neutron [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.913 182096 INFO nova.compute.manager [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Took 0.58 seconds to deallocate network for instance.
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.965 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.966 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:12 compute-0 nova_compute[182092]: 2026-01-23 09:17:12.971 182096 DEBUG nova.compute.manager [req-c8b5085b-8da2-4f0d-99e8-f7dffee2cf1f req-0a496b2e-1c47-4e75-ac8d-00d724cabd40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-deleted-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.038 182096 DEBUG nova.compute.provider_tree [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.049 182096 DEBUG nova.scheduler.client.report [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.065 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.093 182096 INFO nova.scheduler.client.report [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Deleted allocations for instance 2c49518d-6513-4224-844e-0aab2ea675e7
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.145 182096 DEBUG oslo_concurrency.lockutils [None req-79e3a4fa-c11e-4d56-99a5-053ff83b8c57 93e6ff4cf2404b8db0db1ed141716461 3b790db365e14138976e54a3cdfc8140 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.505 182096 DEBUG nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.505 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.506 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.506 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.506 182096 DEBUG nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.507 182096 WARNING nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-unplugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state deleted and task_state None.
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.507 182096 DEBUG nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.507 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.507 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.508 182096 DEBUG oslo_concurrency.lockutils [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2c49518d-6513-4224-844e-0aab2ea675e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.508 182096 DEBUG nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] No waiting events found dispatching network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:13 compute-0 nova_compute[182092]: 2026-01-23 09:17:13.508 182096 WARNING nova.compute.manager [req-d4714b53-2cd2-46b4-a553-3b7357479ca4 req-9c647a56-8613-4603-99eb-d6e031e43c45 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Received unexpected event network-vif-plugged-1eeea937-6c85-4302-be8a-532452fb9f66 for instance with vm_state deleted and task_state None.
Jan 23 09:17:14 compute-0 NetworkManager[54920]: <info>  [1769159834.2378] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 23 09:17:14 compute-0 NetworkManager[54920]: <info>  [1769159834.2384] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 23 09:17:14 compute-0 nova_compute[182092]: 2026-01-23 09:17:14.239 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:14 compute-0 nova_compute[182092]: 2026-01-23 09:17:14.394 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:14 compute-0 ovn_controller[94697]: 2026-01-23T09:17:14Z|00101|binding|INFO|Releasing lport f96f92b7-7bab-4a4d-9a20-b726b45d7b26 from this chassis (sb_readonly=0)
Jan 23 09:17:14 compute-0 nova_compute[182092]: 2026-01-23 09:17:14.408 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:14 compute-0 nova_compute[182092]: 2026-01-23 09:17:14.429 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:15 compute-0 podman[213644]: 2026-01-23 09:17:15.204315205 +0000 UTC m=+0.042546567 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:17:15 compute-0 podman[213645]: 2026-01-23 09:17:15.215175607 +0000 UTC m=+0.051825183 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.580 182096 DEBUG nova.compute.manager [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-changed-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.580 182096 DEBUG nova.compute.manager [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Refreshing instance network info cache due to event network-changed-206e535c-3563-4417-9605-fd69dab3e3e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.580 182096 DEBUG oslo_concurrency.lockutils [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.580 182096 DEBUG oslo_concurrency.lockutils [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.581 182096 DEBUG nova.network.neutron [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Refreshing network info cache for port 206e535c-3563-4417-9605-fd69dab3e3e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:17:15 compute-0 ovn_controller[94697]: 2026-01-23T09:17:15Z|00102|binding|INFO|Releasing lport f96f92b7-7bab-4a4d-9a20-b726b45d7b26 from this chassis (sb_readonly=0)
Jan 23 09:17:15 compute-0 nova_compute[182092]: 2026-01-23 09:17:15.848 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:16 compute-0 nova_compute[182092]: 2026-01-23 09:17:16.903 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:17 compute-0 nova_compute[182092]: 2026-01-23 09:17:17.264 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:18 compute-0 nova_compute[182092]: 2026-01-23 09:17:18.125 182096 DEBUG nova.network.neutron [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updated VIF entry in instance network info cache for port 206e535c-3563-4417-9605-fd69dab3e3e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:17:18 compute-0 nova_compute[182092]: 2026-01-23 09:17:18.126 182096 DEBUG nova.network.neutron [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updating instance_info_cache with network_info: [{"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:18 compute-0 nova_compute[182092]: 2026-01-23 09:17:18.147 182096 DEBUG oslo_concurrency.lockutils [req-989703fb-9416-487d-893e-55f65ebf3a7b req-cc2bb423-4036-4d76-8e25-c7793eed56ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-f1821388-9fca-4290-87d4-9dd6bcf8a43c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:17:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:19.186 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:19.186 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:17:19 compute-0 nova_compute[182092]: 2026-01-23 09:17:19.188 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:21 compute-0 ovn_controller[94697]: 2026-01-23T09:17:21Z|00103|binding|INFO|Releasing lport f96f92b7-7bab-4a4d-9a20-b726b45d7b26 from this chassis (sb_readonly=0)
Jan 23 09:17:21 compute-0 nova_compute[182092]: 2026-01-23 09:17:21.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:21 compute-0 nova_compute[182092]: 2026-01-23 09:17:21.905 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:22 compute-0 nova_compute[182092]: 2026-01-23 09:17:22.265 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:22 compute-0 ovn_controller[94697]: 2026-01-23T09:17:22Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:07:0d:16 10.100.0.4
Jan 23 09:17:22 compute-0 ovn_controller[94697]: 2026-01-23T09:17:22Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:07:0d:16 10.100.0.4
Jan 23 09:17:26 compute-0 nova_compute[182092]: 2026-01-23 09:17:26.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:27 compute-0 podman[213695]: 2026-01-23 09:17:27.207159241 +0000 UTC m=+0.038732057 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:17:27 compute-0 podman[213694]: 2026-01-23 09:17:27.229415455 +0000 UTC m=+0.063013833 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 09:17:27 compute-0 nova_compute[182092]: 2026-01-23 09:17:27.240 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159832.2391603, 2c49518d-6513-4224-844e-0aab2ea675e7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:27 compute-0 nova_compute[182092]: 2026-01-23 09:17:27.241 182096 INFO nova.compute.manager [-] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] VM Stopped (Lifecycle Event)
Jan 23 09:17:27 compute-0 nova_compute[182092]: 2026-01-23 09:17:27.260 182096 DEBUG nova.compute.manager [None req-be19a7f9-1ee7-46a1-87dc-f6ba664c93ac - - - - - -] [instance: 2c49518d-6513-4224-844e-0aab2ea675e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:27 compute-0 nova_compute[182092]: 2026-01-23 09:17:27.266 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:28.188 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:29 compute-0 ovn_controller[94697]: 2026-01-23T09:17:29Z|00104|binding|INFO|Releasing lport f96f92b7-7bab-4a4d-9a20-b726b45d7b26 from this chassis (sb_readonly=0)
Jan 23 09:17:29 compute-0 nova_compute[182092]: 2026-01-23 09:17:29.827 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 podman[213731]: 2026-01-23 09:17:31.206182464 +0000 UTC m=+0.042780738 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.578 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.578 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.578 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.579 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.579 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.585 182096 INFO nova.compute.manager [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Terminating instance
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.590 182096 DEBUG nova.compute.manager [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:17:31 compute-0 kernel: tap206e535c-35 (unregistering): left promiscuous mode
Jan 23 09:17:31 compute-0 NetworkManager[54920]: <info>  [1769159851.6167] device (tap206e535c-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:17:31 compute-0 ovn_controller[94697]: 2026-01-23T09:17:31Z|00105|binding|INFO|Releasing lport 206e535c-3563-4417-9605-fd69dab3e3e8 from this chassis (sb_readonly=0)
Jan 23 09:17:31 compute-0 ovn_controller[94697]: 2026-01-23T09:17:31Z|00106|binding|INFO|Setting lport 206e535c-3563-4417-9605-fd69dab3e3e8 down in Southbound
Jan 23 09:17:31 compute-0 ovn_controller[94697]: 2026-01-23T09:17:31Z|00107|binding|INFO|Removing iface tap206e535c-35 ovn-installed in OVS
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.621 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.623 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:0d:16 10.100.0.4'], port_security=['fa:16:3e:07:0d:16 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f1821388-9fca-4290-87d4-9dd6bcf8a43c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ddb54a12cf4d1985e6acd166753f21', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fd9c031f-2dd7-47fe-8fec-005527785ac2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7ebffb0-186d-47d8-996c-b51a6ab2cd05, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=206e535c-3563-4417-9605-fd69dab3e3e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.624 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 206e535c-3563-4417-9605-fd69dab3e3e8 in datapath b3483cba-74b1-4ca2-b026-c9f53cccfe25 unbound from our chassis
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.625 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3483cba-74b1-4ca2-b026-c9f53cccfe25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.627 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7e916d4c-577a-4c38-b801-b5e090657ed9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.627 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25 namespace which is not needed anymore
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.634 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 23 09:17:31 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Consumed 11.079s CPU time.
Jan 23 09:17:31 compute-0 systemd-machined[153562]: Machine qemu-17-instance-00000025 terminated.
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [NOTICE]   (213566) : haproxy version is 2.8.14-c23fe91
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [NOTICE]   (213566) : path to executable is /usr/sbin/haproxy
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [WARNING]  (213566) : Exiting Master process...
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [WARNING]  (213566) : Exiting Master process...
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [ALERT]    (213566) : Current worker (213568) exited with code 143 (Terminated)
Jan 23 09:17:31 compute-0 neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25[213562]: [WARNING]  (213566) : All workers exited. Exiting... (0)
Jan 23 09:17:31 compute-0 systemd[1]: libpod-e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b.scope: Deactivated successfully.
Jan 23 09:17:31 compute-0 podman[213771]: 2026-01-23 09:17:31.72386092 +0000 UTC m=+0.035174609 container died e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:17:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b-userdata-shm.mount: Deactivated successfully.
Jan 23 09:17:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-07d105a212fa154ba899b5255a24cea46943fb93b0eabb02c907132cd69b688a-merged.mount: Deactivated successfully.
Jan 23 09:17:31 compute-0 podman[213771]: 2026-01-23 09:17:31.744908855 +0000 UTC m=+0.056222544 container cleanup e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:17:31 compute-0 systemd[1]: libpod-conmon-e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b.scope: Deactivated successfully.
Jan 23 09:17:31 compute-0 podman[213796]: 2026-01-23 09:17:31.783348685 +0000 UTC m=+0.023286541 container remove e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.786 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d99a3d67-d0e2-4c4b-b0ac-3aa6c0b629a8]: (4, ('Fri Jan 23 09:17:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25 (e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b)\ne1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b\nFri Jan 23 09:17:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25 (e1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b)\ne1af96152c7891b32c2ddacf1d5e7da504224dd6493b099bcf609abb2193e84b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.788 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[573a5b58-3f80-4360-a706-9809b48f514e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.789 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3483cba-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.790 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 kernel: tapb3483cba-70: left promiscuous mode
Jan 23 09:17:31 compute-0 NetworkManager[54920]: <info>  [1769159851.8055] manager: (tap206e535c-35): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.805 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.807 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[854ecfb5-5ccd-4175-b89e-73ce0c30a4b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.815 182096 DEBUG nova.compute.manager [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-unplugged-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.815 182096 DEBUG oslo_concurrency.lockutils [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.815 182096 DEBUG oslo_concurrency.lockutils [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.815 182096 DEBUG oslo_concurrency.lockutils [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.815 182096 DEBUG nova.compute.manager [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] No waiting events found dispatching network-vif-unplugged-206e535c-3563-4417-9605-fd69dab3e3e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.816 182096 DEBUG nova.compute.manager [req-f4fc6e18-e73d-4ca0-951d-98d09c36439e req-f0203a6a-f474-4f60-9b2d-f3cd252710b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-unplugged-206e535c-3563-4417-9605-fd69dab3e3e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.816 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd62d2d-f566-4178-9348-e482c88eaf34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.817 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d55a7b6c-cff8-4ef1-b0b7-713b717fb6f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.830 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f870df-3b33-41cf-b0a4-e91ec87e7bc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333543, 'reachable_time': 39025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213819, 'error': None, 'target': 'ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 systemd[1]: run-netns-ovnmeta\x2db3483cba\x2d74b1\x2d4ca2\x2db026\x2dc9f53cccfe25.mount: Deactivated successfully.
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.833 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3483cba-74b1-4ca2-b026-c9f53cccfe25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:17:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:31.833 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a3414469-2ed3-41dc-9644-be70d18ee24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.837 182096 INFO nova.virt.libvirt.driver [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Instance destroyed successfully.
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.837 182096 DEBUG nova.objects.instance [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lazy-loading 'resources' on Instance uuid f1821388-9fca-4290-87d4-9dd6bcf8a43c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.857 182096 DEBUG nova.virt.libvirt.vif [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:17:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-5851706',display_name='tempest-ServersTestJSON-server-5851706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-5851706',id=37,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIxx4zHnKAmeEKCZnsx+0+xWt6VmmMvNp5BlPZqzrMqbNnRLMUfgTclMkdYj4pdqpdfo7DP5+3XLcqI3TwrIZwGyqbMdO7pU7ys2L/v/E6T5JeBKyiRbSCSz6rTdC31xFA==',key_name='tempest-keypair-195960502',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:17:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70ddb54a12cf4d1985e6acd166753f21',ramdisk_id='',reservation_id='r-hlgux4hw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-104714381',owner_user_name='tempest-ServersTestJSON-104714381-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:17:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='59ffc99eb9794e72b6a87f9d75fce29d',uuid=f1821388-9fca-4290-87d4-9dd6bcf8a43c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.857 182096 DEBUG nova.network.os_vif_util [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converting VIF {"id": "206e535c-3563-4417-9605-fd69dab3e3e8", "address": "fa:16:3e:07:0d:16", "network": {"id": "b3483cba-74b1-4ca2-b026-c9f53cccfe25", "bridge": "br-int", "label": "tempest-ServersTestJSON-2090952204-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ddb54a12cf4d1985e6acd166753f21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap206e535c-35", "ovs_interfaceid": "206e535c-3563-4417-9605-fd69dab3e3e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.858 182096 DEBUG nova.network.os_vif_util [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.858 182096 DEBUG os_vif [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.859 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.859 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap206e535c-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.860 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.862 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.863 182096 INFO os_vif [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:0d:16,bridge_name='br-int',has_traffic_filtering=True,id=206e535c-3563-4417-9605-fd69dab3e3e8,network=Network(b3483cba-74b1-4ca2-b026-c9f53cccfe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap206e535c-35')
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.863 182096 INFO nova.virt.libvirt.driver [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Deleting instance files /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c_del
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.864 182096 INFO nova.virt.libvirt.driver [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Deletion of /var/lib/nova/instances/f1821388-9fca-4290-87d4-9dd6bcf8a43c_del complete
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.907 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.937 182096 INFO nova.compute.manager [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.937 182096 DEBUG oslo.service.loopingcall [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.937 182096 DEBUG nova.compute.manager [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.937 182096 DEBUG nova.network.neutron [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:17:31 compute-0 nova_compute[182092]: 2026-01-23 09:17:31.963 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.194 182096 DEBUG nova.network.neutron [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.213 182096 INFO nova.compute.manager [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Took 1.28 seconds to deallocate network for instance.
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.273 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.273 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.330 182096 DEBUG nova.compute.provider_tree [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.358 182096 DEBUG nova.scheduler.client.report [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.372 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.405 182096 INFO nova.scheduler.client.report [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Deleted allocations for instance f1821388-9fca-4290-87d4-9dd6bcf8a43c
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.439 182096 DEBUG nova.compute.manager [req-2a4a65e7-f0a9-4766-9674-5a933c5ef747 req-803c2d56-de88-46f9-9749-6a1c0a8cbaf6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-deleted-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.458 182096 DEBUG oslo_concurrency.lockutils [None req-e98d7769-069a-4ba7-beb6-93dd5b7e31e5 59ffc99eb9794e72b6a87f9d75fce29d 70ddb54a12cf4d1985e6acd166753f21 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.913 182096 DEBUG nova.compute.manager [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.914 182096 DEBUG oslo_concurrency.lockutils [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.914 182096 DEBUG oslo_concurrency.lockutils [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.914 182096 DEBUG oslo_concurrency.lockutils [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "f1821388-9fca-4290-87d4-9dd6bcf8a43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.914 182096 DEBUG nova.compute.manager [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] No waiting events found dispatching network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:33 compute-0 nova_compute[182092]: 2026-01-23 09:17:33.915 182096 WARNING nova.compute.manager [req-907e679d-3007-4699-bbb6-adfad582722d req-3f4eb195-a068-4ec6-bfb0-4c0b10414ea2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Received unexpected event network-vif-plugged-206e535c-3563-4417-9605-fd69dab3e3e8 for instance with vm_state deleted and task_state None.
Jan 23 09:17:34 compute-0 nova_compute[182092]: 2026-01-23 09:17:34.342 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:36 compute-0 nova_compute[182092]: 2026-01-23 09:17:36.587 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:36 compute-0 nova_compute[182092]: 2026-01-23 09:17:36.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:36 compute-0 nova_compute[182092]: 2026-01-23 09:17:36.860 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:36 compute-0 nova_compute[182092]: 2026-01-23 09:17:36.862 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:36 compute-0 nova_compute[182092]: 2026-01-23 09:17:36.908 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:39 compute-0 podman[213826]: 2026-01-23 09:17:39.223889389 +0000 UTC m=+0.056645554 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:17:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:39.853 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:39.854 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:39.854 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:41 compute-0 nova_compute[182092]: 2026-01-23 09:17:41.860 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:41 compute-0 nova_compute[182092]: 2026-01-23 09:17:41.910 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.034 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.035 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.051 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.170 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.171 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.177 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.177 182096 INFO nova.compute.claims [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.298 182096 DEBUG nova.compute.provider_tree [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.312 182096 DEBUG nova.scheduler.client.report [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.329 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.329 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.382 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.382 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.409 182096 INFO nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.423 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.494 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.495 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.495 182096 INFO nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Creating image(s)
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.496 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.496 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.497 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.506 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.553 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.554 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.554 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.564 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.610 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.611 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.630 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.631 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.631 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.673 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.674 182096 DEBUG nova.virt.disk.api [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Checking if we can resize image /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.675 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.717 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.718 182096 DEBUG nova.virt.disk.api [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Cannot resize image /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.719 182096 DEBUG nova.objects.instance [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lazy-loading 'migration_context' on Instance uuid a5a9581e-f52e-4598-ae15-8e339c74934a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.731 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.731 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Ensure instance console log exists: /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.731 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.732 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.732 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:44 compute-0 nova_compute[182092]: 2026-01-23 09:17:44.779 182096 DEBUG nova.policy [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '253337df3a1e419f8f4a54d802b8c9ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1206bf77516840f091e751991f350872', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:17:45 compute-0 nova_compute[182092]: 2026-01-23 09:17:45.581 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Successfully created port: 3795ea91-1b70-4c7a-b447-4db72566d706 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:17:46 compute-0 podman[213865]: 2026-01-23 09:17:46.204678678 +0000 UTC m=+0.039187264 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:17:46 compute-0 podman[213864]: 2026-01-23 09:17:46.209138134 +0000 UTC m=+0.045963844 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.834 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159851.8337307, f1821388-9fca-4290-87d4-9dd6bcf8a43c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.835 182096 INFO nova.compute.manager [-] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] VM Stopped (Lifecycle Event)
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.852 182096 DEBUG nova.compute.manager [None req-8984dc23-6c42-40fc-9d3a-abeae2d6f729 - - - - - -] [instance: f1821388-9fca-4290-87d4-9dd6bcf8a43c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.861 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.984 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Successfully updated port: 3795ea91-1b70-4c7a-b447-4db72566d706 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.996 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.996 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquired lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:17:46 compute-0 nova_compute[182092]: 2026-01-23 09:17:46.997 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:17:47 compute-0 nova_compute[182092]: 2026-01-23 09:17:47.243 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:17:47 compute-0 nova_compute[182092]: 2026-01-23 09:17:47.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:47 compute-0 nova_compute[182092]: 2026-01-23 09:17:47.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.197 182096 DEBUG nova.compute.manager [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-changed-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.197 182096 DEBUG nova.compute.manager [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Refreshing instance network info cache due to event network-changed-3795ea91-1b70-4c7a-b447-4db72566d706. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.197 182096 DEBUG oslo_concurrency.lockutils [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.374 182096 DEBUG nova.network.neutron [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Updating instance_info_cache with network_info: [{"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.395 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Releasing lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.395 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Instance network_info: |[{"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.396 182096 DEBUG oslo_concurrency.lockutils [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.396 182096 DEBUG nova.network.neutron [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Refreshing network info cache for port 3795ea91-1b70-4c7a-b447-4db72566d706 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.398 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Start _get_guest_xml network_info=[{"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.402 182096 WARNING nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.407 182096 DEBUG nova.virt.libvirt.host [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.407 182096 DEBUG nova.virt.libvirt.host [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.413 182096 DEBUG nova.virt.libvirt.host [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.413 182096 DEBUG nova.virt.libvirt.host [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.414 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.414 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.415 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.415 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.415 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.415 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.416 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.416 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.416 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.416 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.416 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.417 182096 DEBUG nova.virt.hardware [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.419 182096 DEBUG nova.virt.libvirt.vif [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1504457031',id=39,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1206bf77516840f091e751991f350872',ramdisk_id='',reservation_id='r-iv3dvlip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-39172396',owner_use
r_name='tempest-ImagesOneServerNegativeTestJSON-39172396-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:17:44Z,user_data=None,user_id='253337df3a1e419f8f4a54d802b8c9ce',uuid=a5a9581e-f52e-4598-ae15-8e339c74934a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.419 182096 DEBUG nova.network.os_vif_util [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converting VIF {"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.420 182096 DEBUG nova.network.os_vif_util [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.420 182096 DEBUG nova.objects.instance [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lazy-loading 'pci_devices' on Instance uuid a5a9581e-f52e-4598-ae15-8e339c74934a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.429 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <uuid>a5a9581e-f52e-4598-ae15-8e339c74934a</uuid>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <name>instance-00000027</name>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1504457031</nova:name>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:17:48</nova:creationTime>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:user uuid="253337df3a1e419f8f4a54d802b8c9ce">tempest-ImagesOneServerNegativeTestJSON-39172396-project-member</nova:user>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:project uuid="1206bf77516840f091e751991f350872">tempest-ImagesOneServerNegativeTestJSON-39172396</nova:project>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         <nova:port uuid="3795ea91-1b70-4c7a-b447-4db72566d706">
Jan 23 09:17:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <system>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="serial">a5a9581e-f52e-4598-ae15-8e339c74934a</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="uuid">a5a9581e-f52e-4598-ae15-8e339c74934a</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </system>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <os>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </os>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <features>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </features>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.config"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:ed:54:4e"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <target dev="tap3795ea91-1b"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/console.log" append="off"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <video>
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </video>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:17:48 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:17:48 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:17:48 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:17:48 compute-0 nova_compute[182092]: </domain>
Jan 23 09:17:48 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.430 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Preparing to wait for external event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.430 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.430 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.430 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.431 182096 DEBUG nova.virt.libvirt.vif [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1504457031',id=39,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1206bf77516840f091e751991f350872',ramdisk_id='',reservation_id='r-iv3dvlip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-39172396'
,owner_user_name='tempest-ImagesOneServerNegativeTestJSON-39172396-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:17:44Z,user_data=None,user_id='253337df3a1e419f8f4a54d802b8c9ce',uuid=a5a9581e-f52e-4598-ae15-8e339c74934a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.431 182096 DEBUG nova.network.os_vif_util [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converting VIF {"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.431 182096 DEBUG nova.network.os_vif_util [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.432 182096 DEBUG os_vif [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.432 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.432 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.433 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.436 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.437 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3795ea91-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.437 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3795ea91-1b, col_values=(('external_ids', {'iface-id': '3795ea91-1b70-4c7a-b447-4db72566d706', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:54:4e', 'vm-uuid': 'a5a9581e-f52e-4598-ae15-8e339c74934a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.438 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 NetworkManager[54920]: <info>  [1769159868.4389] manager: (tap3795ea91-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.443 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.443 182096 INFO os_vif [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b')
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.482 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.483 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.483 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] No VIF found with MAC fa:16:3e:ed:54:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.483 182096 INFO nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Using config drive
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.789 182096 INFO nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Creating config drive at /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.config
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.794 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb2naoy3b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.910 182096 DEBUG oslo_concurrency.processutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb2naoy3b" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:48 compute-0 kernel: tap3795ea91-1b: entered promiscuous mode
Jan 23 09:17:48 compute-0 NetworkManager[54920]: <info>  [1769159868.9600] manager: (tap3795ea91-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.958 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 ovn_controller[94697]: 2026-01-23T09:17:48Z|00108|binding|INFO|Claiming lport 3795ea91-1b70-4c7a-b447-4db72566d706 for this chassis.
Jan 23 09:17:48 compute-0 ovn_controller[94697]: 2026-01-23T09:17:48Z|00109|binding|INFO|3795ea91-1b70-4c7a-b447-4db72566d706: Claiming fa:16:3e:ed:54:4e 10.100.0.12
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.961 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 nova_compute[182092]: 2026-01-23 09:17:48.964 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.971 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:54:4e 10.100.0.12'], port_security=['fa:16:3e:ed:54:4e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a5a9581e-f52e-4598-ae15-8e339c74934a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1206bf77516840f091e751991f350872', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234838e2-87ef-4f1a-9117-0457e7a6503e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=292c17ac-dbc7-49ac-b916-0b8266b77000, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3795ea91-1b70-4c7a-b447-4db72566d706) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.972 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3795ea91-1b70-4c7a-b447-4db72566d706 in datapath 3313e41f-d6ab-4945-ae6f-a417b9e33e09 bound to our chassis
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.973 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3313e41f-d6ab-4945-ae6f-a417b9e33e09
Jan 23 09:17:48 compute-0 systemd-udevd[213923]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.985 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2449e364-c078-4067-ad16-8436faba37b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.986 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3313e41f-d1 in ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.987 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3313e41f-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.987 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0080d171-000c-4897-bec5-1d80d4c362c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.989 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20fd1f2d-960e-42cd-b35a-f44510afeb4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:48 compute-0 systemd-machined[153562]: New machine qemu-18-instance-00000027.
Jan 23 09:17:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:48.997 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[aaeb4ad4-2d88-45f3-bc4c-bbaf16572e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 NetworkManager[54920]: <info>  [1769159869.0016] device (tap3795ea91-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:17:49 compute-0 NetworkManager[54920]: <info>  [1769159869.0024] device (tap3795ea91-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:17:49 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000027.
Jan 23 09:17:49 compute-0 ovn_controller[94697]: 2026-01-23T09:17:49Z|00110|binding|INFO|Setting lport 3795ea91-1b70-4c7a-b447-4db72566d706 ovn-installed in OVS
Jan 23 09:17:49 compute-0 ovn_controller[94697]: 2026-01-23T09:17:49Z|00111|binding|INFO|Setting lport 3795ea91-1b70-4c7a-b447-4db72566d706 up in Southbound
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.028 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[993d0947-df9c-480b-b060-6bca3a86ae85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.029 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.049 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f777d7-0764-4e73-8170-bd0930a96256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.052 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2f0915-1901-4e56-a1be-5ac267f839c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 NetworkManager[54920]: <info>  [1769159869.0538] manager: (tap3313e41f-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Jan 23 09:17:49 compute-0 systemd-udevd[213926]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.075 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[964b3381-6592-4ddc-a510-84a1f7becf1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.077 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[15377485-12e4-4082-8c26-ad1446cb56ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 NetworkManager[54920]: <info>  [1769159869.0939] device (tap3313e41f-d0): carrier: link connected
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.098 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[162d3b3f-b13c-4556-9f0a-25d6a46069eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.110 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc820fa7-dd63-4646-8692-7ae1e147ca0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3313e41f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:a2:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337457, 'reachable_time': 38572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213947, 'error': None, 'target': 'ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.123 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf1aa29-1b41-4a7f-a028-021d10a7bd12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:a2b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337457, 'tstamp': 337457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213948, 'error': None, 'target': 'ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.135 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[edefb836-71ce-4303-8905-82b270997b69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3313e41f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:a2:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337457, 'reachable_time': 38572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213949, 'error': None, 'target': 'ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.158 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[673e4110-ed28-49b4-9fdf-7b8e5b7546bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc079b7-f8a9-4152-be11-a3e8a0934be7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.203 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3313e41f-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.203 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.204 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3313e41f-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:49 compute-0 kernel: tap3313e41f-d0: entered promiscuous mode
Jan 23 09:17:49 compute-0 NetworkManager[54920]: <info>  [1769159869.2062] manager: (tap3313e41f-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.209 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3313e41f-d0, col_values=(('external_ids', {'iface-id': 'd57e9987-e490-4f63-893d-ef75bd2efa1d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:49 compute-0 ovn_controller[94697]: 2026-01-23T09:17:49Z|00112|binding|INFO|Releasing lport d57e9987-e490-4f63-893d-ef75bd2efa1d from this chassis (sb_readonly=0)
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.212 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3313e41f-d6ab-4945-ae6f-a417b9e33e09.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3313e41f-d6ab-4945-ae6f-a417b9e33e09.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.213 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a406e0bc-6f9a-4afd-b8e5-286f6222f365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.214 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3313e41f-d6ab-4945-ae6f-a417b9e33e09
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3313e41f-d6ab-4945-ae6f-a417b9e33e09.pid.haproxy
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3313e41f-d6ab-4945-ae6f-a417b9e33e09
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:17:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:49.215 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'env', 'PROCESS_TAG=haproxy-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3313e41f-d6ab-4945-ae6f-a417b9e33e09.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.205 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.224 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.327 182096 DEBUG nova.compute.manager [req-63444c26-d93b-48a0-8691-6775e036a15c req-7e913484-00f6-49cf-8d34-b6f99087a6dc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.327 182096 DEBUG oslo_concurrency.lockutils [req-63444c26-d93b-48a0-8691-6775e036a15c req-7e913484-00f6-49cf-8d34-b6f99087a6dc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.328 182096 DEBUG oslo_concurrency.lockutils [req-63444c26-d93b-48a0-8691-6775e036a15c req-7e913484-00f6-49cf-8d34-b6f99087a6dc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.328 182096 DEBUG oslo_concurrency.lockutils [req-63444c26-d93b-48a0-8691-6775e036a15c req-7e913484-00f6-49cf-8d34-b6f99087a6dc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.328 182096 DEBUG nova.compute.manager [req-63444c26-d93b-48a0-8691-6775e036a15c req-7e913484-00f6-49cf-8d34-b6f99087a6dc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Processing event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:17:49 compute-0 podman[213977]: 2026-01-23 09:17:49.510139065 +0000 UTC m=+0.038412408 container create 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:17:49 compute-0 systemd[1]: Started libpod-conmon-18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa.scope.
Jan 23 09:17:49 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b310c804f6dbfd9b0cb7e242be63ff27632d035dd3be90ffe9b6fde359d6f6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:17:49 compute-0 podman[213977]: 2026-01-23 09:17:49.576733795 +0000 UTC m=+0.105007138 container init 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:17:49 compute-0 podman[213977]: 2026-01-23 09:17:49.581437232 +0000 UTC m=+0.109710565 container start 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 09:17:49 compute-0 podman[213977]: 2026-01-23 09:17:49.489440158 +0000 UTC m=+0.017713512 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:17:49 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [NOTICE]   (213995) : New worker (214001) forked
Jan 23 09:17:49 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [NOTICE]   (213995) : Loading success.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.664 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.665 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159869.6636624, a5a9581e-f52e-4598-ae15-8e339c74934a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.666 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] VM Started (Lifecycle Event)
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.675 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.678 182096 INFO nova.virt.libvirt.driver [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Instance spawned successfully.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.679 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.680 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.682 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.701 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.701 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159869.6643705, a5a9581e-f52e-4598-ae15-8e339c74934a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.702 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] VM Paused (Lifecycle Event)
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.705 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.705 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.705 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.706 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.706 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.707 182096 DEBUG nova.virt.libvirt.driver [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.723 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.725 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159869.6659107, a5a9581e-f52e-4598-ae15-8e339c74934a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.726 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] VM Resumed (Lifecycle Event)
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.745 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.747 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.765 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.769 182096 INFO nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Took 5.27 seconds to spawn the instance on the hypervisor.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.769 182096 DEBUG nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.828 182096 INFO nova.compute.manager [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Took 5.71 seconds to build instance.
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.862 182096 DEBUG oslo_concurrency.lockutils [None req-8533291d-63ee-434e-8f06-530470bc1d09 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.890 182096 DEBUG nova.network.neutron [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Updated VIF entry in instance network info cache for port 3795ea91-1b70-4c7a-b447-4db72566d706. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.891 182096 DEBUG nova.network.neutron [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Updating instance_info_cache with network_info: [{"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:49 compute-0 nova_compute[182092]: 2026-01-23 09:17:49.906 182096 DEBUG oslo_concurrency.lockutils [req-890d0757-732d-4200-a92f-14ffe1a8dbc2 req-01860e94-4a7f-4421-bce5-ef64f57cf013 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a5a9581e-f52e-4598-ae15-8e339c74934a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.539 182096 DEBUG nova.compute.manager [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.540 182096 DEBUG oslo_concurrency.lockutils [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.540 182096 DEBUG oslo_concurrency.lockutils [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.541 182096 DEBUG oslo_concurrency.lockutils [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.541 182096 DEBUG nova.compute.manager [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] No waiting events found dispatching network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.541 182096 WARNING nova.compute.manager [req-59dad271-d2ed-41d6-8c94-be0ebbc4a486 req-79bb764d-86d6-488b-bbc4-ca38ce00fc2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received unexpected event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 for instance with vm_state active and task_state None.
Jan 23 09:17:51 compute-0 nova_compute[182092]: 2026-01-23 09:17:51.913 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:52 compute-0 nova_compute[182092]: 2026-01-23 09:17:52.662 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:52 compute-0 nova_compute[182092]: 2026-01-23 09:17:52.663 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.439 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.662 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.663 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.663 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.664 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.707 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.765 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.766 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:53 compute-0 nova_compute[182092]: 2026-01-23 09:17:53.824 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.035 182096 DEBUG nova.compute.manager [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.046 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.047 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5553MB free_disk=73.37330627441406GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.047 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.047 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.126 182096 INFO nova.compute.manager [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] instance snapshotting
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.226 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance a5a9581e-f52e-4598-ae15-8e339c74934a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.227 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.227 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.296 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.368 182096 INFO nova.virt.libvirt.driver [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Beginning live snapshot process
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.370 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.370 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.397 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.420 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.458 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.468 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.491 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.491 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:54 compute-0 virtqemud[181713]: invalid argument: disk vda does not have an active block job
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.527 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.584 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.585 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.641 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.652 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.677 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.677 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.710 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.711 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.737 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c.delta 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.738 182096 INFO nova.virt.libvirt.driver [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 23 09:17:54 compute-0 nova_compute[182092]: 2026-01-23 09:17:54.776 182096 DEBUG nova.virt.libvirt.guest [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.278 182096 DEBUG nova.virt.libvirt.guest [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.281 182096 INFO nova.virt.libvirt.driver [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.302 182096 DEBUG nova.privsep.utils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.302 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c.delta /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.374 182096 DEBUG oslo_concurrency.processutils [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c.delta /var/lib/nova/instances/snapshots/tmpmil6vrns/7b30d26fd66741319e5b1c179d662a0c" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.375 182096 INFO nova.virt.libvirt.driver [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Snapshot extracted, beginning image upload
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.660 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:55 compute-0 nova_compute[182092]: 2026-01-23 09:17:55.677 182096 WARNING nova.compute.manager [None req-07102151-3184-44b8-984d-aff1bc05253e 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Image not found during snapshot: nova.exception.ImageNotFound: Image f68e9dd0-c68c-47eb-82c1-124513bbce21 could not be found.
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.590 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.591 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.591 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.591 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.592 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.598 182096 INFO nova.compute.manager [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Terminating instance
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.604 182096 DEBUG nova.compute.manager [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:17:56 compute-0 kernel: tap3795ea91-1b (unregistering): left promiscuous mode
Jan 23 09:17:56 compute-0 NetworkManager[54920]: <info>  [1769159876.6207] device (tap3795ea91-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.626 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 ovn_controller[94697]: 2026-01-23T09:17:56Z|00113|binding|INFO|Releasing lport 3795ea91-1b70-4c7a-b447-4db72566d706 from this chassis (sb_readonly=0)
Jan 23 09:17:56 compute-0 ovn_controller[94697]: 2026-01-23T09:17:56Z|00114|binding|INFO|Setting lport 3795ea91-1b70-4c7a-b447-4db72566d706 down in Southbound
Jan 23 09:17:56 compute-0 ovn_controller[94697]: 2026-01-23T09:17:56Z|00115|binding|INFO|Removing iface tap3795ea91-1b ovn-installed in OVS
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.629 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.641 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:54:4e 10.100.0.12'], port_security=['fa:16:3e:ed:54:4e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a5a9581e-f52e-4598-ae15-8e339c74934a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1206bf77516840f091e751991f350872', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234838e2-87ef-4f1a-9117-0457e7a6503e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=292c17ac-dbc7-49ac-b916-0b8266b77000, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3795ea91-1b70-4c7a-b447-4db72566d706) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.642 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3795ea91-1b70-4c7a-b447-4db72566d706 in datapath 3313e41f-d6ab-4945-ae6f-a417b9e33e09 unbound from our chassis
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.644 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3313e41f-d6ab-4945-ae6f-a417b9e33e09, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.646 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.645 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3400ad6c-a89c-4201-87b6-5ca8fcff2b3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.647 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09 namespace which is not needed anymore
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:17:56 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 23 09:17:56 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Consumed 7.651s CPU time.
Jan 23 09:17:56 compute-0 systemd-machined[153562]: Machine qemu-18-instance-00000027 terminated.
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.663 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:56 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [NOTICE]   (213995) : haproxy version is 2.8.14-c23fe91
Jan 23 09:17:56 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [NOTICE]   (213995) : path to executable is /usr/sbin/haproxy
Jan 23 09:17:56 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [WARNING]  (213995) : Exiting Master process...
Jan 23 09:17:56 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [ALERT]    (213995) : Current worker (214001) exited with code 143 (Terminated)
Jan 23 09:17:56 compute-0 neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09[213989]: [WARNING]  (213995) : All workers exited. Exiting... (0)
Jan 23 09:17:56 compute-0 systemd[1]: libpod-18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa.scope: Deactivated successfully.
Jan 23 09:17:56 compute-0 podman[214062]: 2026-01-23 09:17:56.744196758 +0000 UTC m=+0.035183066 container died 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa-userdata-shm.mount: Deactivated successfully.
Jan 23 09:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-4b310c804f6dbfd9b0cb7e242be63ff27632d035dd3be90ffe9b6fde359d6f6a-merged.mount: Deactivated successfully.
Jan 23 09:17:56 compute-0 podman[214062]: 2026-01-23 09:17:56.764947261 +0000 UTC m=+0.055933569 container cleanup 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:17:56 compute-0 systemd[1]: libpod-conmon-18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa.scope: Deactivated successfully.
Jan 23 09:17:56 compute-0 podman[214084]: 2026-01-23 09:17:56.8064171 +0000 UTC m=+0.023993846 container remove 18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.811 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4e502f7a-2f90-42e2-87e8-272ecc88d5e1]: (4, ('Fri Jan 23 09:17:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09 (18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa)\n18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa\nFri Jan 23 09:17:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09 (18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa)\n18b3b0ba1ca7fc8b86c278aec6f79a5e65959113943f8050da76d8149e2beeaa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.812 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2549a0c6-dfad-482f-b19a-7909cb5af073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.813 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3313e41f-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.814 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 kernel: tap3313e41f-d0: left promiscuous mode
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.828 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.831 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.833 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb4ccdc-962b-4dad-93a1-5980535bba8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.841 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[409b6448-feb4-4250-94d3-b18bdeeec597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.842 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b01716a3-046d-4935-a15f-9d08842c122a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.858 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[242abbe4-f7a4-47b5-8b89-950a3f58eb70]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337453, 'reachable_time': 31314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214109, 'error': None, 'target': 'ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.858 182096 INFO nova.virt.libvirt.driver [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Instance destroyed successfully.
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.859 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3313e41f-d6ab-4945-ae6f-a417b9e33e09 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:17:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:17:56.860 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[92912be4-eade-41f6-9fd1-9391a628842d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:17:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d3313e41f\x2dd6ab\x2d4945\x2dae6f\x2da417b9e33e09.mount: Deactivated successfully.
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.859 182096 DEBUG nova.objects.instance [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lazy-loading 'resources' on Instance uuid a5a9581e-f52e-4598-ae15-8e339c74934a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.878 182096 DEBUG nova.virt.libvirt.vif [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1504457031',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1504457031',id=39,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:17:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1206bf77516840f091e751991f350872',ramdisk_id='',reservation_id='r-iv3dvlip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-39172396',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-39172396-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:17:55Z,user_data=None,user_id='253337df3a1e419f8f4a54d802b8c9ce',uuid=a5a9581e-f52e-4598-ae15-8e339c74934a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.878 182096 DEBUG nova.network.os_vif_util [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converting VIF {"id": "3795ea91-1b70-4c7a-b447-4db72566d706", "address": "fa:16:3e:ed:54:4e", "network": {"id": "3313e41f-d6ab-4945-ae6f-a417b9e33e09", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1490083839-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1206bf77516840f091e751991f350872", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3795ea91-1b", "ovs_interfaceid": "3795ea91-1b70-4c7a-b447-4db72566d706", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.879 182096 DEBUG nova.network.os_vif_util [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.879 182096 DEBUG os_vif [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.880 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.881 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3795ea91-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.882 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.885 182096 INFO os_vif [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:54:4e,bridge_name='br-int',has_traffic_filtering=True,id=3795ea91-1b70-4c7a-b447-4db72566d706,network=Network(3313e41f-d6ab-4945-ae6f-a417b9e33e09),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3795ea91-1b')
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.885 182096 INFO nova.virt.libvirt.driver [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Deleting instance files /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a_del
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.886 182096 INFO nova.virt.libvirt.driver [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Deletion of /var/lib/nova/instances/a5a9581e-f52e-4598-ae15-8e339c74934a_del complete
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.914 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.947 182096 INFO nova.compute.manager [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.947 182096 DEBUG oslo.service.loopingcall [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.947 182096 DEBUG nova.compute.manager [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:17:56 compute-0 nova_compute[182092]: 2026-01-23 09:17:56.947 182096 DEBUG nova.network.neutron [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.514 182096 DEBUG nova.network.neutron [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.534 182096 INFO nova.compute.manager [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Took 0.59 seconds to deallocate network for instance.
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.582 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.582 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.643 182096 DEBUG nova.compute.provider_tree [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.651 182096 DEBUG nova.scheduler.client.report [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.664 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.695 182096 INFO nova.scheduler.client.report [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Deleted allocations for instance a5a9581e-f52e-4598-ae15-8e339c74934a
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.764 182096 DEBUG oslo_concurrency.lockutils [None req-07a0d167-96a8-497a-854f-b27ffc7092b6 253337df3a1e419f8f4a54d802b8c9ce 1206bf77516840f091e751991f350872 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.926 182096 DEBUG nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-vif-unplugged-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.926 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.927 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.927 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.927 182096 DEBUG nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] No waiting events found dispatching network-vif-unplugged-3795ea91-1b70-4c7a-b447-4db72566d706 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.927 182096 WARNING nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received unexpected event network-vif-unplugged-3795ea91-1b70-4c7a-b447-4db72566d706 for instance with vm_state deleted and task_state None.
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.927 182096 DEBUG nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.928 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.928 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.928 182096 DEBUG oslo_concurrency.lockutils [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a5a9581e-f52e-4598-ae15-8e339c74934a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.928 182096 DEBUG nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] No waiting events found dispatching network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:17:57 compute-0 nova_compute[182092]: 2026-01-23 09:17:57.928 182096 WARNING nova.compute.manager [req-2eb1157c-f99b-418d-a1e0-1397190afaed req-2639748d-11af-4f4e-bdaf-5fe1fd3e49f9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received unexpected event network-vif-plugged-3795ea91-1b70-4c7a-b447-4db72566d706 for instance with vm_state deleted and task_state None.
Jan 23 09:17:58 compute-0 podman[214115]: 2026-01-23 09:17:58.197768355 +0000 UTC m=+0.036486033 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 09:17:58 compute-0 podman[214116]: 2026-01-23 09:17:58.205142502 +0000 UTC m=+0.043004687 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:17:58 compute-0 nova_compute[182092]: 2026-01-23 09:17:58.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:17:58 compute-0 nova_compute[182092]: 2026-01-23 09:17:58.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:17:58 compute-0 nova_compute[182092]: 2026-01-23 09:17:58.660 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:17:59 compute-0 nova_compute[182092]: 2026-01-23 09:17:59.320 182096 DEBUG nova.compute.manager [req-37883b2c-3285-4b33-ad4b-4e65d4ad383e req-d5253948-90ca-43a3-80e3-fbde850e8b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Received event network-vif-deleted-3795ea91-1b70-4c7a-b447-4db72566d706 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:01 compute-0 nova_compute[182092]: 2026-01-23 09:18:01.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:01 compute-0 nova_compute[182092]: 2026-01-23 09:18:01.915 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:02 compute-0 podman[214153]: 2026-01-23 09:18:02.229152395 +0000 UTC m=+0.064197477 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.120 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.120 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.134 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.198 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.198 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.202 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.202 182096 INFO nova.compute.claims [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.295 182096 DEBUG nova.compute.provider_tree [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.303 182096 DEBUG nova.scheduler.client.report [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.316 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.316 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.381 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.382 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.396 182096 INFO nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.408 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.489 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.489 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.490 182096 INFO nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Creating image(s)
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.490 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.490 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.491 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.501 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.535 182096 DEBUG nova.policy [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fcad5e4e431544b3a7a9714deb863d62', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '125c5d17683c4bbca1aa1dbf19f80a04', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.547 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.547 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.548 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.557 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.603 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.603 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.626 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.626 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.627 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.674 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.674 182096 DEBUG nova.virt.disk.api [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Checking if we can resize image /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.675 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.720 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.721 182096 DEBUG nova.virt.disk.api [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Cannot resize image /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.721 182096 DEBUG nova.objects.instance [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ddf78b5-aedc-410a-a697-b44ad9c0e826 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.742 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.742 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Ensure instance console log exists: /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.742 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.743 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:04 compute-0 nova_compute[182092]: 2026-01-23 09:18:04.743 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:05 compute-0 nova_compute[182092]: 2026-01-23 09:18:05.162 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Successfully created port: 3668ddc0-e958-448b-80e8-3a5f1c2432ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.758 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Successfully updated port: 3668ddc0-e958-448b-80e8-3a5f1c2432ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.770 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.771 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquired lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.771 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.884 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:06 compute-0 nova_compute[182092]: 2026-01-23 09:18:06.916 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.049 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.154 182096 DEBUG nova.compute.manager [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-changed-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.155 182096 DEBUG nova.compute.manager [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Refreshing instance network info cache due to event network-changed-3668ddc0-e958-448b-80e8-3a5f1c2432ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.155 182096 DEBUG oslo_concurrency.lockutils [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.621 182096 DEBUG nova.network.neutron [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Updating instance_info_cache with network_info: [{"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.636 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Releasing lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.636 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Instance network_info: |[{"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.636 182096 DEBUG oslo_concurrency.lockutils [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.637 182096 DEBUG nova.network.neutron [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Refreshing network info cache for port 3668ddc0-e958-448b-80e8-3a5f1c2432ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.639 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Start _get_guest_xml network_info=[{"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.641 182096 WARNING nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.646 182096 DEBUG nova.virt.libvirt.host [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.646 182096 DEBUG nova.virt.libvirt.host [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.648 182096 DEBUG nova.virt.libvirt.host [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.649 182096 DEBUG nova.virt.libvirt.host [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.650 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.650 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.650 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.650 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.651 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.651 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.651 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.651 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.651 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.652 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.652 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.652 182096 DEBUG nova.virt.hardware [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.654 182096 DEBUG nova.virt.libvirt.vif [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2023714302',display_name='tempest-ImagesTestJSON-server-2023714302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2023714302',id=43,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='125c5d17683c4bbca1aa1dbf19f80a04',ramdisk_id='',reservation_id='r-tmt0dw0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1039257294',owner_user_name='tempest-ImagesTestJSON-1039257294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:18:04Z,user_data=None,user_id='fcad5e4e431544b3a7a9714deb863d62',uuid=5ddf78b5-aedc-410a-a697-b44ad9c0e826,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.655 182096 DEBUG nova.network.os_vif_util [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converting VIF {"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.655 182096 DEBUG nova.network.os_vif_util [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.656 182096 DEBUG nova.objects.instance [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ddf78b5-aedc-410a-a697-b44ad9c0e826 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.665 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <uuid>5ddf78b5-aedc-410a-a697-b44ad9c0e826</uuid>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <name>instance-0000002b</name>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:name>tempest-ImagesTestJSON-server-2023714302</nova:name>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:18:07</nova:creationTime>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:user uuid="fcad5e4e431544b3a7a9714deb863d62">tempest-ImagesTestJSON-1039257294-project-member</nova:user>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:project uuid="125c5d17683c4bbca1aa1dbf19f80a04">tempest-ImagesTestJSON-1039257294</nova:project>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         <nova:port uuid="3668ddc0-e958-448b-80e8-3a5f1c2432ae">
Jan 23 09:18:07 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <system>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="serial">5ddf78b5-aedc-410a-a697-b44ad9c0e826</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="uuid">5ddf78b5-aedc-410a-a697-b44ad9c0e826</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </system>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <os>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </os>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <features>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </features>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.config"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:8b:c0:dd"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <target dev="tap3668ddc0-e9"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/console.log" append="off"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <video>
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </video>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:18:07 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:18:07 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:18:07 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:18:07 compute-0 nova_compute[182092]: </domain>
Jan 23 09:18:07 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.667 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Preparing to wait for external event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.668 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.668 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.668 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.669 182096 DEBUG nova.virt.libvirt.vif [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2023714302',display_name='tempest-ImagesTestJSON-server-2023714302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2023714302',id=43,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='125c5d17683c4bbca1aa1dbf19f80a04',ramdisk_id='',reservation_id='r-tmt0dw0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1039257294',owner_user_name='tempest-ImagesTestJSON-1039257294-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:18:04Z,user_data=None,user_id='fcad5e4e431544b3a7a9714deb863d62',uuid=5ddf78b5-aedc-410a-a697-b44ad9c0e826,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.670 182096 DEBUG nova.network.os_vif_util [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converting VIF {"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.671 182096 DEBUG nova.network.os_vif_util [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.671 182096 DEBUG os_vif [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.672 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.672 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.673 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.677 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3668ddc0-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.677 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3668ddc0-e9, col_values=(('external_ids', {'iface-id': '3668ddc0-e958-448b-80e8-3a5f1c2432ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:c0:dd', 'vm-uuid': '5ddf78b5-aedc-410a-a697-b44ad9c0e826'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.678 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:07 compute-0 NetworkManager[54920]: <info>  [1769159887.6798] manager: (tap3668ddc0-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.685 182096 INFO os_vif [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9')
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.723 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.723 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.724 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] No VIF found with MAC fa:16:3e:8b:c0:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:18:07 compute-0 nova_compute[182092]: 2026-01-23 09:18:07.724 182096 INFO nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Using config drive
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.737 182096 INFO nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Creating config drive at /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.config
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.741 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3a75mp6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.858 182096 DEBUG oslo_concurrency.processutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3a75mp6" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:09 compute-0 kernel: tap3668ddc0-e9: entered promiscuous mode
Jan 23 09:18:09 compute-0 NetworkManager[54920]: <info>  [1769159889.9081] manager: (tap3668ddc0-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.914 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:09 compute-0 ovn_controller[94697]: 2026-01-23T09:18:09Z|00116|binding|INFO|Claiming lport 3668ddc0-e958-448b-80e8-3a5f1c2432ae for this chassis.
Jan 23 09:18:09 compute-0 ovn_controller[94697]: 2026-01-23T09:18:09Z|00117|binding|INFO|3668ddc0-e958-448b-80e8-3a5f1c2432ae: Claiming fa:16:3e:8b:c0:dd 10.100.0.7
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.921 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:c0:dd 10.100.0.7'], port_security=['fa:16:3e:8b:c0:dd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '5ddf78b5-aedc-410a-a697-b44ad9c0e826', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '125c5d17683c4bbca1aa1dbf19f80a04', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7c3ff6e5-7d5e-4201-8797-1b4034623bb2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85936a0-d6b2-4ff9-bcff-558e06bcddd5, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3668ddc0-e958-448b-80e8-3a5f1c2432ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.922 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3668ddc0-e958-448b-80e8-3a5f1c2432ae in datapath 68fb53fe-64d5-42ad-8aa7-576b3663d3b5 bound to our chassis
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.923 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 68fb53fe-64d5-42ad-8aa7-576b3663d3b5
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.933 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4d9205-6084-4991-a5d5-f1778904477a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.933 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap68fb53fe-61 in ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.934 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap68fb53fe-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.935 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a797c345-5e38-4f0e-8800-fe1914e98939]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.935 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f8c09f-62ba-454c-be02-ca9da8d726b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.947 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[3eadfe43-b273-4a02-a864-6d5add4c4972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:09 compute-0 systemd-udevd[214216]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:18:09 compute-0 systemd-machined[153562]: New machine qemu-19-instance-0000002b.
Jan 23 09:18:09 compute-0 NetworkManager[54920]: <info>  [1769159889.9681] device (tap3668ddc0-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:18:09 compute-0 NetworkManager[54920]: <info>  [1769159889.9689] device (tap3668ddc0-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.970 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:09 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000002b.
Jan 23 09:18:09 compute-0 ovn_controller[94697]: 2026-01-23T09:18:09Z|00118|binding|INFO|Setting lport 3668ddc0-e958-448b-80e8-3a5f1c2432ae ovn-installed in OVS
Jan 23 09:18:09 compute-0 ovn_controller[94697]: 2026-01-23T09:18:09Z|00119|binding|INFO|Setting lport 3668ddc0-e958-448b-80e8-3a5f1c2432ae up in Southbound
Jan 23 09:18:09 compute-0 nova_compute[182092]: 2026-01-23 09:18:09.976 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:09.979 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9d010a6c-494a-44ae-9fd8-82e9c5476a78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.002 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c200aa99-f708-46f3-b936-cc46703552b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.008 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea04c12-f3f7-407a-a645-3fbf30384227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 systemd-udevd[214222]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:18:10 compute-0 NetworkManager[54920]: <info>  [1769159890.0097] manager: (tap68fb53fe-60): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.034 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8b86f56c-77ba-4814-9ded-c056c692f177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.036 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5c723217-a004-4556-b113-58c4398ebad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 NetworkManager[54920]: <info>  [1769159890.0542] device (tap68fb53fe-60): carrier: link connected
Jan 23 09:18:10 compute-0 podman[214198]: 2026-01-23 09:18:10.055536369 +0000 UTC m=+0.151385323 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.061 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1a1e00-a5af-4a1f-8596-817724165761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.074 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9f80701f-b382-481f-93ec-5246df452173]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68fb53fe-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:66:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339554, 'reachable_time': 30193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214255, 'error': None, 'target': 'ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.083 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[07812dee-232c-4c85-8509-cc9a946889bb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:6680'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339554, 'tstamp': 339554}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214256, 'error': None, 'target': 'ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.091 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.093 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b2133500-90b3-44fa-8cff-46e9aa4d4f63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68fb53fe-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:66:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339554, 'reachable_time': 30193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.112 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Triggering sync for uuid 5ddf78b5-aedc-410a-a697-b44ad9c0e826 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.112 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.111 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[631bf07e-1ad6-405b-b63f-d85d435b24c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.152 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2a11bd24-4f1c-471a-893d-c6ee817c1850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.153 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68fb53fe-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.154 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.154 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68fb53fe-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:10 compute-0 NetworkManager[54920]: <info>  [1769159890.1560] manager: (tap68fb53fe-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 23 09:18:10 compute-0 kernel: tap68fb53fe-60: entered promiscuous mode
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.157 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap68fb53fe-60, col_values=(('external_ids', {'iface-id': '3c76f8c3-52a0-4002-a39c-21104d6dda78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:10 compute-0 ovn_controller[94697]: 2026-01-23T09:18:10Z|00120|binding|INFO|Releasing lport 3c76f8c3-52a0-4002-a39c-21104d6dda78 from this chassis (sb_readonly=0)
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.167 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.171 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68fb53fe-64d5-42ad-8aa7-576b3663d3b5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68fb53fe-64d5-42ad-8aa7-576b3663d3b5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.171 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[130bfe78-070b-4c8a-b492-d43facd0e67f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.172 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-68fb53fe-64d5-42ad-8aa7-576b3663d3b5
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/68fb53fe-64d5-42ad-8aa7-576b3663d3b5.pid.haproxy
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 68fb53fe-64d5-42ad-8aa7-576b3663d3b5
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:18:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:10.173 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'env', 'PROCESS_TAG=haproxy-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/68fb53fe-64d5-42ad-8aa7-576b3663d3b5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:18:10 compute-0 podman[214285]: 2026-01-23 09:18:10.452333955 +0000 UTC m=+0.030457636 container create a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:18:10 compute-0 systemd[1]: Started libpod-conmon-a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31.scope.
Jan 23 09:18:10 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:18:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c421aac8bbddbb9f9e58591c9020a08a4e2f84169a28e249fbede59ec0cbb37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:18:10 compute-0 podman[214285]: 2026-01-23 09:18:10.504089277 +0000 UTC m=+0.082212978 container init a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:18:10 compute-0 podman[214285]: 2026-01-23 09:18:10.509958044 +0000 UTC m=+0.088081724 container start a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:18:10 compute-0 podman[214285]: 2026-01-23 09:18:10.437912487 +0000 UTC m=+0.016036188 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:18:10 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [NOTICE]   (214301) : New worker (214303) forked
Jan 23 09:18:10 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [NOTICE]   (214301) : Loading success.
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.836 182096 DEBUG nova.compute.manager [req-0a998d8a-5f8f-43c5-8615-fad548d57d77 req-5cdcd91c-59ef-4b1c-93e4-7ac6332d145e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.837 182096 DEBUG oslo_concurrency.lockutils [req-0a998d8a-5f8f-43c5-8615-fad548d57d77 req-5cdcd91c-59ef-4b1c-93e4-7ac6332d145e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.837 182096 DEBUG oslo_concurrency.lockutils [req-0a998d8a-5f8f-43c5-8615-fad548d57d77 req-5cdcd91c-59ef-4b1c-93e4-7ac6332d145e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.837 182096 DEBUG oslo_concurrency.lockutils [req-0a998d8a-5f8f-43c5-8615-fad548d57d77 req-5cdcd91c-59ef-4b1c-93e4-7ac6332d145e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.838 182096 DEBUG nova.compute.manager [req-0a998d8a-5f8f-43c5-8615-fad548d57d77 req-5cdcd91c-59ef-4b1c-93e4-7ac6332d145e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Processing event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.981 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159890.9808216, 5ddf78b5-aedc-410a-a697-b44ad9c0e826 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.981 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] VM Started (Lifecycle Event)
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.982 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.986 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.988 182096 INFO nova.virt.libvirt.driver [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Instance spawned successfully.
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.988 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:18:10 compute-0 nova_compute[182092]: 2026-01-23 09:18:10.998 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.002 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.005 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.005 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.006 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.006 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.006 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.007 182096 DEBUG nova.virt.libvirt.driver [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.027 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.027 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159890.981472, 5ddf78b5-aedc-410a-a697-b44ad9c0e826 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.027 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] VM Paused (Lifecycle Event)
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.042 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.044 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159890.9845726, 5ddf78b5-aedc-410a-a697-b44ad9c0e826 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.044 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] VM Resumed (Lifecycle Event)
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.055 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.057 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.070 182096 INFO nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Took 6.58 seconds to spawn the instance on the hypervisor.
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.071 182096 DEBUG nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.072 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.165 182096 INFO nova.compute.manager [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Took 6.99 seconds to build instance.
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.181 182096 DEBUG oslo_concurrency.lockutils [None req-81d2bb2c-7ba4-49ce-94bb-bf56dae357bb fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.181 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.182 182096 INFO nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.182 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.859 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159876.8576794, a5a9581e-f52e-4598-ae15-8e339c74934a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.859 182096 INFO nova.compute.manager [-] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] VM Stopped (Lifecycle Event)
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.874 182096 DEBUG nova.compute.manager [None req-81f82d09-6c76-4cf1-a032-45fd6535ab06 - - - - - -] [instance: a5a9581e-f52e-4598-ae15-8e339c74934a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:11 compute-0 nova_compute[182092]: 2026-01-23 09:18:11.917 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:12 compute-0 nova_compute[182092]: 2026-01-23 09:18:12.679 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:12 compute-0 nova_compute[182092]: 2026-01-23 09:18:12.772 182096 DEBUG nova.network.neutron [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Updated VIF entry in instance network info cache for port 3668ddc0-e958-448b-80e8-3a5f1c2432ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:18:12 compute-0 nova_compute[182092]: 2026-01-23 09:18:12.773 182096 DEBUG nova.network.neutron [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Updating instance_info_cache with network_info: [{"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:18:12 compute-0 nova_compute[182092]: 2026-01-23 09:18:12.796 182096 DEBUG oslo_concurrency.lockutils [req-03026941-eb95-4843-ab42-a1598f88b792 req-d9bb9b83-9a74-4371-afdf-9584b5f04eda 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-5ddf78b5-aedc-410a-a697-b44ad9c0e826" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.204 182096 DEBUG nova.compute.manager [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.204 182096 DEBUG oslo_concurrency.lockutils [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.205 182096 DEBUG oslo_concurrency.lockutils [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.205 182096 DEBUG oslo_concurrency.lockutils [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.205 182096 DEBUG nova.compute.manager [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] No waiting events found dispatching network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:18:13 compute-0 nova_compute[182092]: 2026-01-23 09:18:13.205 182096 WARNING nova.compute.manager [req-d77ff94d-284e-4197-b110-faf61c75d95f req-d5f7a8ae-0270-4160-97bc-e7c0337966ca 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received unexpected event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae for instance with vm_state active and task_state None.
Jan 23 09:18:14 compute-0 nova_compute[182092]: 2026-01-23 09:18:14.930 182096 DEBUG nova.compute.manager [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.002 182096 INFO nova.compute.manager [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] instance snapshotting
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.215 182096 INFO nova.virt.libvirt.driver [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Beginning live snapshot process
Jan 23 09:18:15 compute-0 virtqemud[181713]: invalid argument: disk vda does not have an active block job
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.482 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.537 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.538 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.594 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.605 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.661 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.662 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.692 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58.delta 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.693 182096 INFO nova.virt.libvirt.driver [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.732 182096 DEBUG nova.virt.libvirt.guest [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.735 182096 INFO nova.virt.libvirt.driver [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.759 182096 DEBUG nova.privsep.utils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.759 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58.delta /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.830 182096 DEBUG oslo_concurrency.processutils [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58.delta /var/lib/nova/instances/snapshots/tmpz9sq8p6s/233b0baa98bf438db74d656180e5ea58" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:18:15 compute-0 nova_compute[182092]: 2026-01-23 09:18:15.831 182096 INFO nova.virt.libvirt.driver [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Snapshot extracted, beginning image upload
Jan 23 09:18:16 compute-0 nova_compute[182092]: 2026-01-23 09:18:16.918 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:17 compute-0 podman[214342]: 2026-01-23 09:18:17.217273384 +0000 UTC m=+0.052140319 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:18:17 compute-0 podman[214341]: 2026-01-23 09:18:17.235369578 +0000 UTC m=+0.072396741 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:18:17 compute-0 nova_compute[182092]: 2026-01-23 09:18:17.543 182096 INFO nova.virt.libvirt.driver [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Snapshot image upload complete
Jan 23 09:18:17 compute-0 nova_compute[182092]: 2026-01-23 09:18:17.545 182096 INFO nova.compute.manager [None req-884b9363-aa0a-4efb-b4f1-a641ff45d5b3 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Took 2.54 seconds to snapshot the instance on the hypervisor.
Jan 23 09:18:17 compute-0 nova_compute[182092]: 2026-01-23 09:18:17.681 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:19.239 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:18:19 compute-0 nova_compute[182092]: 2026-01-23 09:18:19.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:19.241 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:18:21 compute-0 nova_compute[182092]: 2026-01-23 09:18:21.919 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:22 compute-0 nova_compute[182092]: 2026-01-23 09:18:22.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:23 compute-0 ovn_controller[94697]: 2026-01-23T09:18:23Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:c0:dd 10.100.0.7
Jan 23 09:18:23 compute-0 ovn_controller[94697]: 2026-01-23T09:18:23Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:c0:dd 10.100.0.7
Jan 23 09:18:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:23.243 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:26 compute-0 nova_compute[182092]: 2026-01-23 09:18:26.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:27 compute-0 nova_compute[182092]: 2026-01-23 09:18:27.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 podman[214391]: 2026-01-23 09:18:29.203203549 +0000 UTC m=+0.038437016 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.204 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.205 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.205 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.205 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.205 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:29 compute-0 podman[214392]: 2026-01-23 09:18:29.208298937 +0000 UTC m=+0.041539252 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:18:29 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.220 182096 INFO nova.compute.manager [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Terminating instance
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.226 182096 DEBUG nova.compute.manager [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:18:29 compute-0 kernel: tap3668ddc0-e9 (unregistering): left promiscuous mode
Jan 23 09:18:29 compute-0 NetworkManager[54920]: <info>  [1769159909.2480] device (tap3668ddc0-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:18:29 compute-0 ovn_controller[94697]: 2026-01-23T09:18:29Z|00121|binding|INFO|Releasing lport 3668ddc0-e958-448b-80e8-3a5f1c2432ae from this chassis (sb_readonly=0)
Jan 23 09:18:29 compute-0 ovn_controller[94697]: 2026-01-23T09:18:29Z|00122|binding|INFO|Setting lport 3668ddc0-e958-448b-80e8-3a5f1c2432ae down in Southbound
Jan 23 09:18:29 compute-0 ovn_controller[94697]: 2026-01-23T09:18:29Z|00123|binding|INFO|Removing iface tap3668ddc0-e9 ovn-installed in OVS
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.254 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.260 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:c0:dd 10.100.0.7'], port_security=['fa:16:3e:8b:c0:dd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '5ddf78b5-aedc-410a-a697-b44ad9c0e826', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '125c5d17683c4bbca1aa1dbf19f80a04', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7c3ff6e5-7d5e-4201-8797-1b4034623bb2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85936a0-d6b2-4ff9-bcff-558e06bcddd5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3668ddc0-e958-448b-80e8-3a5f1c2432ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.261 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3668ddc0-e958-448b-80e8-3a5f1c2432ae in datapath 68fb53fe-64d5-42ad-8aa7-576b3663d3b5 unbound from our chassis
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.263 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 68fb53fe-64d5-42ad-8aa7-576b3663d3b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.268 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed254cb-f6af-4420-a92f-3e796dbb65b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.268 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5 namespace which is not needed anymore
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.272 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 23 09:18:29 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002b.scope: Consumed 11.614s CPU time.
Jan 23 09:18:29 compute-0 systemd-machined[153562]: Machine qemu-19-instance-0000002b terminated.
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [NOTICE]   (214301) : haproxy version is 2.8.14-c23fe91
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [NOTICE]   (214301) : path to executable is /usr/sbin/haproxy
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [WARNING]  (214301) : Exiting Master process...
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [WARNING]  (214301) : Exiting Master process...
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [ALERT]    (214301) : Current worker (214303) exited with code 143 (Terminated)
Jan 23 09:18:29 compute-0 neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5[214297]: [WARNING]  (214301) : All workers exited. Exiting... (0)
Jan 23 09:18:29 compute-0 systemd[1]: libpod-a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31.scope: Deactivated successfully.
Jan 23 09:18:29 compute-0 podman[214450]: 2026-01-23 09:18:29.364112696 +0000 UTC m=+0.031240372 container died a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 09:18:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31-userdata-shm.mount: Deactivated successfully.
Jan 23 09:18:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c421aac8bbddbb9f9e58591c9020a08a4e2f84169a28e249fbede59ec0cbb37-merged.mount: Deactivated successfully.
Jan 23 09:18:29 compute-0 podman[214450]: 2026-01-23 09:18:29.389098042 +0000 UTC m=+0.056225719 container cleanup a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:18:29 compute-0 systemd[1]: libpod-conmon-a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31.scope: Deactivated successfully.
Jan 23 09:18:29 compute-0 podman[214473]: 2026-01-23 09:18:29.428024883 +0000 UTC m=+0.024276400 container remove a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.432 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20ea60e5-b58b-47f2-a79d-2fcfffa1952c]: (4, ('Fri Jan 23 09:18:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5 (a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31)\na7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31\nFri Jan 23 09:18:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5 (a7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31)\na7b35340a366fc30b153b7c73fc1ac115a9dcb80e4c14e9e65317a33c7ea5f31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.433 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[78ce3314-0516-45e2-a5cb-6b3fdec1e024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.434 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68fb53fe-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.436 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.448 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 kernel: tap68fb53fe-60: left promiscuous mode
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.454 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.457 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d2fce58e-9196-4603-9412-5d470a5be6e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.464 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25133ed9-9667-45cd-9924-9aaccb9dbd0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.465 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0571d025-9908-4ec9-8cb9-fc5504162bad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.477 182096 INFO nova.virt.libvirt.driver [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Instance destroyed successfully.
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.477 182096 DEBUG nova.objects.instance [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lazy-loading 'resources' on Instance uuid 5ddf78b5-aedc-410a-a697-b44ad9c0e826 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:18:29 compute-0 systemd[1]: run-netns-ovnmeta\x2d68fb53fe\x2d64d5\x2d42ad\x2d8aa7\x2d576b3663d3b5.mount: Deactivated successfully.
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.477 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc57a3e-1081-43d4-bb23-f69135075b28]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339548, 'reachable_time': 39696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214498, 'error': None, 'target': 'ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.480 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-68fb53fe-64d5-42ad-8aa7-576b3663d3b5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:18:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:29.480 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3a5c73-d54f-4470-91c1-45d15283a829]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.493 182096 DEBUG nova.virt.libvirt.vif [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:18:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2023714302',display_name='tempest-ImagesTestJSON-server-2023714302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2023714302',id=43,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:18:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='125c5d17683c4bbca1aa1dbf19f80a04',ramdisk_id='',reservation_id='r-tmt0dw0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',owner_project_name='tempest-ImagesTestJSON-1039257294',owner_user_name='tempest-ImagesTestJSON-1039257294-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:18:17Z,user_data=None,user_id='fcad5e4e431544b3a7a9714deb863d62',uuid=5ddf78b5-aedc-410a-a697-b44ad9c0e826,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.493 182096 DEBUG nova.network.os_vif_util [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converting VIF {"id": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "address": "fa:16:3e:8b:c0:dd", "network": {"id": "68fb53fe-64d5-42ad-8aa7-576b3663d3b5", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1746602613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "125c5d17683c4bbca1aa1dbf19f80a04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3668ddc0-e9", "ovs_interfaceid": "3668ddc0-e958-448b-80e8-3a5f1c2432ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.494 182096 DEBUG nova.network.os_vif_util [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.494 182096 DEBUG os_vif [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.495 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.496 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3668ddc0-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.497 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.498 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.500 182096 INFO os_vif [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:c0:dd,bridge_name='br-int',has_traffic_filtering=True,id=3668ddc0-e958-448b-80e8-3a5f1c2432ae,network=Network(68fb53fe-64d5-42ad-8aa7-576b3663d3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3668ddc0-e9')
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.500 182096 INFO nova.virt.libvirt.driver [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Deleting instance files /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826_del
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.500 182096 INFO nova.virt.libvirt.driver [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Deletion of /var/lib/nova/instances/5ddf78b5-aedc-410a-a697-b44ad9c0e826_del complete
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.558 182096 INFO nova.compute.manager [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.558 182096 DEBUG oslo.service.loopingcall [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.558 182096 DEBUG nova.compute.manager [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:18:29 compute-0 nova_compute[182092]: 2026-01-23 09:18:29.559 182096 DEBUG nova.network.neutron [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.102 182096 DEBUG nova.network.neutron [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.115 182096 INFO nova.compute.manager [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Took 0.56 seconds to deallocate network for instance.
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.185 182096 DEBUG nova.compute.manager [req-3ee1f597-519a-478f-b630-648858876d62 req-483f3b2b-7cd8-4ca8-a254-0dbed5bb7c96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-vif-deleted-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.196 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.196 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.235 182096 DEBUG nova.compute.provider_tree [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.256 182096 DEBUG nova.scheduler.client.report [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.282 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.315 182096 INFO nova.scheduler.client.report [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Deleted allocations for instance 5ddf78b5-aedc-410a-a697-b44ad9c0e826
Jan 23 09:18:30 compute-0 nova_compute[182092]: 2026-01-23 09:18:30.408 182096 DEBUG oslo_concurrency.lockutils [None req-a190c013-f3dc-44fa-b627-8de9ed384744 fcad5e4e431544b3a7a9714deb863d62 125c5d17683c4bbca1aa1dbf19f80a04 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.382 182096 DEBUG nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-vif-unplugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.382 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.382 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.383 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.383 182096 DEBUG nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] No waiting events found dispatching network-vif-unplugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.383 182096 WARNING nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received unexpected event network-vif-unplugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae for instance with vm_state deleted and task_state None.
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.383 182096 DEBUG nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.383 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.384 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.384 182096 DEBUG oslo_concurrency.lockutils [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ddf78b5-aedc-410a-a697-b44ad9c0e826-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.384 182096 DEBUG nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] No waiting events found dispatching network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.384 182096 WARNING nova.compute.manager [req-d7de5a55-4435-4cdb-9126-8ce3cc255e72 req-e3f0514c-300a-4688-a1f6-650a1c657aa2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Received unexpected event network-vif-plugged-3668ddc0-e958-448b-80e8-3a5f1c2432ae for instance with vm_state deleted and task_state None.
Jan 23 09:18:31 compute-0 nova_compute[182092]: 2026-01-23 09:18:31.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:18:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:18:33 compute-0 podman[214503]: 2026-01-23 09:18:33.199222393 +0000 UTC m=+0.037867422 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=openstack_network_exporter)
Jan 23 09:18:34 compute-0 nova_compute[182092]: 2026-01-23 09:18:34.497 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:34 compute-0 nova_compute[182092]: 2026-01-23 09:18:34.949 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:36 compute-0 nova_compute[182092]: 2026-01-23 09:18:36.923 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:39 compute-0 nova_compute[182092]: 2026-01-23 09:18:39.498 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:39.854 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:39.855 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:18:39.855 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:40 compute-0 podman[214521]: 2026-01-23 09:18:40.227223146 +0000 UTC m=+0.059514726 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 09:18:41 compute-0 nova_compute[182092]: 2026-01-23 09:18:41.924 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:44 compute-0 nova_compute[182092]: 2026-01-23 09:18:44.475 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159909.475286, 5ddf78b5-aedc-410a-a697-b44ad9c0e826 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:18:44 compute-0 nova_compute[182092]: 2026-01-23 09:18:44.476 182096 INFO nova.compute.manager [-] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] VM Stopped (Lifecycle Event)
Jan 23 09:18:44 compute-0 nova_compute[182092]: 2026-01-23 09:18:44.491 182096 DEBUG nova.compute.manager [None req-6480d55a-ffc2-4597-96fa-4898a03727ba - - - - - -] [instance: 5ddf78b5-aedc-410a-a697-b44ad9c0e826] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:18:44 compute-0 nova_compute[182092]: 2026-01-23 09:18:44.499 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:46 compute-0 nova_compute[182092]: 2026-01-23 09:18:46.925 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:48 compute-0 podman[214544]: 2026-01-23 09:18:48.205159892 +0000 UTC m=+0.040206123 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:18:48 compute-0 podman[214543]: 2026-01-23 09:18:48.20855635 +0000 UTC m=+0.045938809 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:18:49 compute-0 nova_compute[182092]: 2026-01-23 09:18:49.499 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:51 compute-0 nova_compute[182092]: 2026-01-23 09:18:51.926 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:53 compute-0 nova_compute[182092]: 2026-01-23 09:18:53.651 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:53 compute-0 nova_compute[182092]: 2026-01-23 09:18:53.652 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:18:54 compute-0 nova_compute[182092]: 2026-01-23 09:18:54.500 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:54 compute-0 nova_compute[182092]: 2026-01-23 09:18:54.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.674 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.675 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.872 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.873 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.37425231933594GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.873 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.873 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.926 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.927 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.947 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.962 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.974 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:18:55 compute-0 nova_compute[182092]: 2026-01-23 09:18:55.974 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:18:56 compute-0 nova_compute[182092]: 2026-01-23 09:18:56.927 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:18:56 compute-0 nova_compute[182092]: 2026-01-23 09:18:56.975 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:56 compute-0 nova_compute[182092]: 2026-01-23 09:18:56.975 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:18:56 compute-0 nova_compute[182092]: 2026-01-23 09:18:56.975 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:18:56 compute-0 nova_compute[182092]: 2026-01-23 09:18:56.986 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:18:58 compute-0 nova_compute[182092]: 2026-01-23 09:18:58.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:58 compute-0 nova_compute[182092]: 2026-01-23 09:18:58.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:18:59 compute-0 nova_compute[182092]: 2026-01-23 09:18:59.501 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:00 compute-0 podman[214584]: 2026-01-23 09:19:00.2098555 +0000 UTC m=+0.042745163 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:19:00 compute-0 podman[214585]: 2026-01-23 09:19:00.209941773 +0000 UTC m=+0.039809203 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:19:01 compute-0 nova_compute[182092]: 2026-01-23 09:19:01.928 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.170 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.171 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.351 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.424 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.425 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.429 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.430 182096 INFO nova.compute.claims [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.524 182096 DEBUG nova.compute.provider_tree [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.534 182096 DEBUG nova.scheduler.client.report [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.549 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.550 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.586 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.586 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.601 182096 INFO nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.618 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.719 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.722 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.723 182096 INFO nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Creating image(s)
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.723 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.723 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.724 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.734 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.780 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.781 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.782 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.792 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.805 182096 DEBUG nova.policy [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10ee3f3ad9014355a5e44b919026c456', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d500707899f49baa15640293933cb00', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.837 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.838 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.872 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.873 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.873 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.939 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.940 182096 DEBUG nova.virt.disk.api [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Checking if we can resize image /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:03 compute-0 nova_compute[182092]: 2026-01-23 09:19:03.940 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.005 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.006 182096 DEBUG nova.virt.disk.api [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Cannot resize image /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.006 182096 DEBUG nova.objects.instance [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lazy-loading 'migration_context' on Instance uuid ba63ce37-d255-4310-9b43-66422e2ad4b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.026 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.026 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Ensure instance console log exists: /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.026 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.027 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.027 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:04 compute-0 podman[214637]: 2026-01-23 09:19:04.202685306 +0000 UTC m=+0.041517366 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64)
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.502 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:04 compute-0 nova_compute[182092]: 2026-01-23 09:19:04.775 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Successfully created port: 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.145 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Successfully updated port: 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.162 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.162 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquired lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.163 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.252 182096 DEBUG nova.compute.manager [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-changed-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.252 182096 DEBUG nova.compute.manager [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Refreshing instance network info cache due to event network-changed-77cf3143-8e1d-4ff0-b03a-134abf5c1c58. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.252 182096 DEBUG oslo_concurrency.lockutils [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.331 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:19:06 compute-0 nova_compute[182092]: 2026-01-23 09:19:06.929 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.842 182096 DEBUG nova.network.neutron [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Updating instance_info_cache with network_info: [{"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.858 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Releasing lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.859 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Instance network_info: |[{"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.859 182096 DEBUG oslo_concurrency.lockutils [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.859 182096 DEBUG nova.network.neutron [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Refreshing network info cache for port 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.861 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Start _get_guest_xml network_info=[{"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.864 182096 WARNING nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.869 182096 DEBUG nova.virt.libvirt.host [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.870 182096 DEBUG nova.virt.libvirt.host [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.873 182096 DEBUG nova.virt.libvirt.host [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.873 182096 DEBUG nova.virt.libvirt.host [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.874 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.875 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.875 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.875 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.875 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.875 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.876 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.876 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.876 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.876 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.876 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.877 182096 DEBUG nova.virt.hardware [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.879 182096 DEBUG nova.virt.libvirt.vif [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1514416608',display_name='tempest-ImagesNegativeTestJSON-server-1514416608',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1514416608',id=50,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d500707899f49baa15640293933cb00',ramdisk_id='',reservation_id='r-bik47el2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2103654228',owner_user_name='tempest-ImagesNegativeTestJSON-2103654228-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:03Z,user_data=None,user_id='10ee3f3ad9014355a5e44b919026c456',uuid=ba63ce37-d255-4310-9b43-66422e2ad4b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.879 182096 DEBUG nova.network.os_vif_util [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converting VIF {"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.880 182096 DEBUG nova.network.os_vif_util [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.881 182096 DEBUG nova.objects.instance [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lazy-loading 'pci_devices' on Instance uuid ba63ce37-d255-4310-9b43-66422e2ad4b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.891 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <uuid>ba63ce37-d255-4310-9b43-66422e2ad4b6</uuid>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <name>instance-00000032</name>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:name>tempest-ImagesNegativeTestJSON-server-1514416608</nova:name>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:07</nova:creationTime>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:user uuid="10ee3f3ad9014355a5e44b919026c456">tempest-ImagesNegativeTestJSON-2103654228-project-member</nova:user>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:project uuid="7d500707899f49baa15640293933cb00">tempest-ImagesNegativeTestJSON-2103654228</nova:project>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         <nova:port uuid="77cf3143-8e1d-4ff0-b03a-134abf5c1c58">
Jan 23 09:19:07 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="serial">ba63ce37-d255-4310-9b43-66422e2ad4b6</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="uuid">ba63ce37-d255-4310-9b43-66422e2ad4b6</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.config"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:c5:2e:90"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <target dev="tap77cf3143-8e"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/console.log" append="off"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:07 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:07 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:07 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:07 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:07 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.892 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Preparing to wait for external event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.892 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.892 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.892 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.893 182096 DEBUG nova.virt.libvirt.vif [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1514416608',display_name='tempest-ImagesNegativeTestJSON-server-1514416608',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1514416608',id=50,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d500707899f49baa15640293933cb00',ramdisk_id='',reservation_id='r-bik47el2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2103654228',owner_user_name='tempest-ImagesNegativeTestJSON-2103654228-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:03Z,user_data=None,user_id='10ee3f3ad9014355a5e44b919026c456',uuid=ba63ce37-d255-4310-9b43-66422e2ad4b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.893 182096 DEBUG nova.network.os_vif_util [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converting VIF {"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.894 182096 DEBUG nova.network.os_vif_util [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.894 182096 DEBUG os_vif [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.894 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.895 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.895 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.896 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.897 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77cf3143-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.897 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77cf3143-8e, col_values=(('external_ids', {'iface-id': '77cf3143-8e1d-4ff0-b03a-134abf5c1c58', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:2e:90', 'vm-uuid': 'ba63ce37-d255-4310-9b43-66422e2ad4b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.898 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:07 compute-0 NetworkManager[54920]: <info>  [1769159947.8994] manager: (tap77cf3143-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.902 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.903 182096 INFO os_vif [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e')
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.943 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.943 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.944 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] No VIF found with MAC fa:16:3e:c5:2e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:19:07 compute-0 nova_compute[182092]: 2026-01-23 09:19:07.944 182096 INFO nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Using config drive
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.257 182096 INFO nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Creating config drive at /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.config
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.261 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz_irl_2c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.381 182096 DEBUG oslo_concurrency.processutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz_irl_2c" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:08 compute-0 kernel: tap77cf3143-8e: entered promiscuous mode
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.4210] manager: (tap77cf3143-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.421 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.423 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.424 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 ovn_controller[94697]: 2026-01-23T09:19:08Z|00124|binding|INFO|Claiming lport 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 for this chassis.
Jan 23 09:19:08 compute-0 ovn_controller[94697]: 2026-01-23T09:19:08Z|00125|binding|INFO|77cf3143-8e1d-4ff0-b03a-134abf5c1c58: Claiming fa:16:3e:c5:2e:90 10.100.0.10
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.438 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:2e:90 10.100.0.10'], port_security=['fa:16:3e:c5:2e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ba63ce37-d255-4310-9b43-66422e2ad4b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d500707899f49baa15640293933cb00', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd3acd4f1-d477-4873-a6be-e6f3c9a12729', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c69b6176-6617-42ce-92f6-8bd57b01017d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=77cf3143-8e1d-4ff0-b03a-134abf5c1c58) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.439 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 in datapath 1f7ffe2f-3ff4-41f8-bc47-9f0605049164 bound to our chassis
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.440 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f7ffe2f-3ff4-41f8-bc47-9f0605049164
Jan 23 09:19:08 compute-0 systemd-udevd[214673]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.448 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c02a877f-fa1e-4839-bcaf-2ff713747640]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.449 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f7ffe2f-31 in ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.450 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f7ffe2f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.450 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12832b5f-0fe4-4448-b078-5f7f1f02cfa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.451 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6a43ff-ced3-4f67-a511-f1d2c9ca575a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.4604] device (tap77cf3143-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.4620] device (tap77cf3143-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.460 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e71fec-2939-4ee7-a769-77265265b9a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 systemd-machined[153562]: New machine qemu-20-instance-00000032.
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.484 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000032.
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.485 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9b273640-2257-424f-bf72-d541e13082fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_controller[94697]: 2026-01-23T09:19:08Z|00126|binding|INFO|Setting lport 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 ovn-installed in OVS
Jan 23 09:19:08 compute-0 ovn_controller[94697]: 2026-01-23T09:19:08Z|00127|binding|INFO|Setting lport 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 up in Southbound
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.490 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.511 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9427d76a-0f0a-461f-9764-d5a7d782bdc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.5157] manager: (tap1f7ffe2f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Jan 23 09:19:08 compute-0 systemd-udevd[214678]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.515 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8af6d1dd-d29b-45bd-8e5e-0147a2307874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.538 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e53ef15e-5299-4a9e-b497-101bea9ca101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.540 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[95634a9d-2a54-4257-8dd1-5a9932268e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.5584] device (tap1f7ffe2f-30): carrier: link connected
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.562 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5617a58c-bb02-49b1-872d-5faf978076d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.577 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[88d1b340-7e7b-4241-b5ee-745b0b484ad8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f7ffe2f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:c4:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345404, 'reachable_time': 42538, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214700, 'error': None, 'target': 'ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.588 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d168f5-45cb-472d-af66-38555d968519]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:c49b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345404, 'tstamp': 345404}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214701, 'error': None, 'target': 'ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.601 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccef821-0ca6-48de-8f2c-34d5b604325d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f7ffe2f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:c4:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345404, 'reachable_time': 42538, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214702, 'error': None, 'target': 'ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[846f955d-87ed-4e00-8e94-7e4130787c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.663 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdbbd46-d464-4dcb-9f57-34e72dfdf054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.664 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f7ffe2f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.664 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.665 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f7ffe2f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:08 compute-0 NetworkManager[54920]: <info>  [1769159948.6669] manager: (tap1f7ffe2f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 23 09:19:08 compute-0 kernel: tap1f7ffe2f-30: entered promiscuous mode
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.666 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.670 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f7ffe2f-30, col_values=(('external_ids', {'iface-id': 'dd110bdc-166e-4e29-bb1f-380352e1c962'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.671 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 ovn_controller[94697]: 2026-01-23T09:19:08Z|00128|binding|INFO|Releasing lport dd110bdc-166e-4e29-bb1f-380352e1c962 from this chassis (sb_readonly=0)
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.675 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f7ffe2f-3ff4-41f8-bc47-9f0605049164.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f7ffe2f-3ff4-41f8-bc47-9f0605049164.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.683 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd4105d-ffaa-4eff-95cb-3170169af0c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.684 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.684 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-1f7ffe2f-3ff4-41f8-bc47-9f0605049164
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/1f7ffe2f-3ff4-41f8-bc47-9f0605049164.pid.haproxy
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 1f7ffe2f-3ff4-41f8-bc47-9f0605049164
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:19:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:08.685 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'env', 'PROCESS_TAG=haproxy-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f7ffe2f-3ff4-41f8-bc47-9f0605049164.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.720 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159948.7196977, ba63ce37-d255-4310-9b43-66422e2ad4b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.720 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] VM Started (Lifecycle Event)
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.738 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.740 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159948.7198017, ba63ce37-d255-4310-9b43-66422e2ad4b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.740 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] VM Paused (Lifecycle Event)
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.753 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.755 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:08 compute-0 nova_compute[182092]: 2026-01-23 09:19:08.767 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:08 compute-0 podman[214738]: 2026-01-23 09:19:08.974517485 +0000 UTC m=+0.032389825 container create e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 09:19:08 compute-0 systemd[1]: Started libpod-conmon-e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd.scope.
Jan 23 09:19:09 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f417e105079035695c2e1878d6ccb6d5af16800818c6486be8b6c0a97abe2b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:19:09 compute-0 podman[214738]: 2026-01-23 09:19:09.036758413 +0000 UTC m=+0.094630763 container init e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:19:09 compute-0 podman[214738]: 2026-01-23 09:19:09.041344166 +0000 UTC m=+0.099216505 container start e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:19:09 compute-0 podman[214738]: 2026-01-23 09:19:08.959904201 +0000 UTC m=+0.017776552 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:19:09 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [NOTICE]   (214754) : New worker (214756) forked
Jan 23 09:19:09 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [NOTICE]   (214754) : Loading success.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.578 182096 DEBUG nova.compute.manager [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.579 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.579 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.579 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.579 182096 DEBUG nova.compute.manager [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Processing event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 DEBUG nova.compute.manager [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 DEBUG oslo_concurrency.lockutils [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 DEBUG nova.compute.manager [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] No waiting events found dispatching network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.580 182096 WARNING nova.compute.manager [req-47b1d776-6813-42af-b19c-9ce9e854a0e4 req-5305e442-fced-4f9f-9dc2-5e63a236eef4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received unexpected event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 for instance with vm_state building and task_state spawning.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.581 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.584 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159949.5845091, ba63ce37-d255-4310-9b43-66422e2ad4b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.584 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] VM Resumed (Lifecycle Event)
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.587 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.589 182096 INFO nova.virt.libvirt.driver [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Instance spawned successfully.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.589 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.618 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.620 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.620 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.621 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.621 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.621 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.622 182096 DEBUG nova.virt.libvirt.driver [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.624 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.651 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.683 182096 INFO nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Took 5.96 seconds to spawn the instance on the hypervisor.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.683 182096 DEBUG nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.779 182096 INFO nova.compute.manager [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Took 6.38 seconds to build instance.
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.803 182096 DEBUG oslo_concurrency.lockutils [None req-6a92d5aa-bc72-4ac8-b83f-bf120219efdb 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.936 182096 DEBUG nova.network.neutron [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Updated VIF entry in instance network info cache for port 77cf3143-8e1d-4ff0-b03a-134abf5c1c58. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.936 182096 DEBUG nova.network.neutron [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Updating instance_info_cache with network_info: [{"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:09 compute-0 nova_compute[182092]: 2026-01-23 09:19:09.950 182096 DEBUG oslo_concurrency.lockutils [req-2bfbcc5f-216a-4c58-a7d7-89bae619c296 req-2b33506d-136b-4b14-ae2f-56090daec3ed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-ba63ce37-d255-4310-9b43-66422e2ad4b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:11 compute-0 podman[214761]: 2026-01-23 09:19:11.223595507 +0000 UTC m=+0.060185502 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.508 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.509 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.509 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.509 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.510 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.517 182096 INFO nova.compute.manager [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Terminating instance
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.523 182096 DEBUG nova.compute.manager [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:19:11 compute-0 kernel: tap77cf3143-8e (unregistering): left promiscuous mode
Jan 23 09:19:11 compute-0 NetworkManager[54920]: <info>  [1769159951.5430] device (tap77cf3143-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.550 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 ovn_controller[94697]: 2026-01-23T09:19:11Z|00129|binding|INFO|Releasing lport 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 from this chassis (sb_readonly=0)
Jan 23 09:19:11 compute-0 ovn_controller[94697]: 2026-01-23T09:19:11Z|00130|binding|INFO|Setting lport 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 down in Southbound
Jan 23 09:19:11 compute-0 ovn_controller[94697]: 2026-01-23T09:19:11Z|00131|binding|INFO|Removing iface tap77cf3143-8e ovn-installed in OVS
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.553 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.564 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:2e:90 10.100.0.10'], port_security=['fa:16:3e:c5:2e:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ba63ce37-d255-4310-9b43-66422e2ad4b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d500707899f49baa15640293933cb00', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd3acd4f1-d477-4873-a6be-e6f3c9a12729', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c69b6176-6617-42ce-92f6-8bd57b01017d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=77cf3143-8e1d-4ff0-b03a-134abf5c1c58) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.565 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 77cf3143-8e1d-4ff0-b03a-134abf5c1c58 in datapath 1f7ffe2f-3ff4-41f8-bc47-9f0605049164 unbound from our chassis
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.567 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f7ffe2f-3ff4-41f8-bc47-9f0605049164, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.567 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3f45fa5c-cf73-4d4a-afab-da4c9c230894]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.568 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164 namespace which is not needed anymore
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.568 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 23 09:19:11 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Consumed 2.145s CPU time.
Jan 23 09:19:11 compute-0 systemd-machined[153562]: Machine qemu-20-instance-00000032 terminated.
Jan 23 09:19:11 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [NOTICE]   (214754) : haproxy version is 2.8.14-c23fe91
Jan 23 09:19:11 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [NOTICE]   (214754) : path to executable is /usr/sbin/haproxy
Jan 23 09:19:11 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [ALERT]    (214754) : Current worker (214756) exited with code 143 (Terminated)
Jan 23 09:19:11 compute-0 neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164[214750]: [WARNING]  (214754) : All workers exited. Exiting... (0)
Jan 23 09:19:11 compute-0 systemd[1]: libpod-e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd.scope: Deactivated successfully.
Jan 23 09:19:11 compute-0 conmon[214750]: conmon e82a23ede5845ace9a6f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd.scope/container/memory.events
Jan 23 09:19:11 compute-0 podman[214807]: 2026-01-23 09:19:11.665745202 +0000 UTC m=+0.036001099 container died e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:19:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd-userdata-shm.mount: Deactivated successfully.
Jan 23 09:19:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f417e105079035695c2e1878d6ccb6d5af16800818c6486be8b6c0a97abe2b9-merged.mount: Deactivated successfully.
Jan 23 09:19:11 compute-0 podman[214807]: 2026-01-23 09:19:11.686331675 +0000 UTC m=+0.056587562 container cleanup e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:19:11 compute-0 systemd[1]: libpod-conmon-e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd.scope: Deactivated successfully.
Jan 23 09:19:11 compute-0 podman[214830]: 2026-01-23 09:19:11.72820369 +0000 UTC m=+0.025575468 container remove e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.729 182096 DEBUG nova.compute.manager [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-unplugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.729 182096 DEBUG oslo_concurrency.lockutils [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.730 182096 DEBUG oslo_concurrency.lockutils [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.730 182096 DEBUG oslo_concurrency.lockutils [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.730 182096 DEBUG nova.compute.manager [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] No waiting events found dispatching network-vif-unplugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.731 182096 DEBUG nova.compute.manager [req-56486b11-1c95-454e-bece-f308c82e3590 req-c955cd01-c6fa-459e-b353-181a151adf5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-unplugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.732 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8034813c-142c-40d2-a0ce-14772a3eb70e]: (4, ('Fri Jan 23 09:19:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164 (e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd)\ne82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd\nFri Jan 23 09:19:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164 (e82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd)\ne82a23ede5845ace9a6f823e6c77c7706cbab4ef6d1c6246ee6ced700a9597cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.734 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[075da087-3ccb-4397-8e92-2204187b1f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.735 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f7ffe2f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.737 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.752 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 kernel: tap1f7ffe2f-30: left promiscuous mode
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.759 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.760 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[51979d83-2845-4b3b-9320-4e94e5f13b86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.769 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c322c95-98fa-43c4-886f-ee816a1afbaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.769 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c36bc1fd-9909-4b9b-9b8c-579b086faacf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.776 182096 INFO nova.virt.libvirt.driver [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Instance destroyed successfully.
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.777 182096 DEBUG nova.objects.instance [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lazy-loading 'resources' on Instance uuid ba63ce37-d255-4310-9b43-66422e2ad4b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.783 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d35cd787-e417-4028-83fe-4ccec336b3a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345399, 'reachable_time': 16947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214860, 'error': None, 'target': 'ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d1f7ffe2f\x2d3ff4\x2d41f8\x2dbc47\x2d9f0605049164.mount: Deactivated successfully.
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.785 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f7ffe2f-3ff4-41f8-bc47-9f0605049164 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:19:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:11.785 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[df4d28c8-c527-4e4e-a30d-2a36e4971fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.793 182096 DEBUG nova.virt.libvirt.vif [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1514416608',display_name='tempest-ImagesNegativeTestJSON-server-1514416608',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1514416608',id=50,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d500707899f49baa15640293933cb00',ramdisk_id='',reservation_id='r-bik47el2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-2103654228',owner_user_name='tempest-ImagesNegativeTestJSON-2103654228-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:09Z,user_data=None,user_id='10ee3f3ad9014355a5e44b919026c456',uuid=ba63ce37-d255-4310-9b43-66422e2ad4b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.793 182096 DEBUG nova.network.os_vif_util [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converting VIF {"id": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "address": "fa:16:3e:c5:2e:90", "network": {"id": "1f7ffe2f-3ff4-41f8-bc47-9f0605049164", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1641267322-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d500707899f49baa15640293933cb00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77cf3143-8e", "ovs_interfaceid": "77cf3143-8e1d-4ff0-b03a-134abf5c1c58", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.794 182096 DEBUG nova.network.os_vif_util [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.794 182096 DEBUG os_vif [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.795 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.795 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77cf3143-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.798 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.799 182096 INFO os_vif [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:2e:90,bridge_name='br-int',has_traffic_filtering=True,id=77cf3143-8e1d-4ff0-b03a-134abf5c1c58,network=Network(1f7ffe2f-3ff4-41f8-bc47-9f0605049164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77cf3143-8e')
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.800 182096 INFO nova.virt.libvirt.driver [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Deleting instance files /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6_del
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.800 182096 INFO nova.virt.libvirt.driver [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Deletion of /var/lib/nova/instances/ba63ce37-d255-4310-9b43-66422e2ad4b6_del complete
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.859 182096 INFO nova.compute.manager [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.859 182096 DEBUG oslo.service.loopingcall [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.860 182096 DEBUG nova.compute.manager [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.860 182096 DEBUG nova.network.neutron [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:19:11 compute-0 nova_compute[182092]: 2026-01-23 09:19:11.930 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.403 182096 DEBUG nova.network.neutron [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.415 182096 INFO nova.compute.manager [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Took 0.56 seconds to deallocate network for instance.
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.490 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.491 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.502 182096 DEBUG nova.compute.manager [req-8dcfee62-4579-45e5-8843-0d0abcdc994f req-b8191184-5e38-48e4-b2b9-4b5fa2b76737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-deleted-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.553 182096 DEBUG nova.compute.provider_tree [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.564 182096 DEBUG nova.scheduler.client.report [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.581 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.601 182096 INFO nova.scheduler.client.report [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Deleted allocations for instance ba63ce37-d255-4310-9b43-66422e2ad4b6
Jan 23 09:19:12 compute-0 nova_compute[182092]: 2026-01-23 09:19:12.686 182096 DEBUG oslo_concurrency.lockutils [None req-799fc1d0-7d70-4ec9-848a-f3797939ecdc 10ee3f3ad9014355a5e44b919026c456 7d500707899f49baa15640293933cb00 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.840 182096 DEBUG nova.compute.manager [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.840 182096 DEBUG oslo_concurrency.lockutils [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.841 182096 DEBUG oslo_concurrency.lockutils [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.841 182096 DEBUG oslo_concurrency.lockutils [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ba63ce37-d255-4310-9b43-66422e2ad4b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.841 182096 DEBUG nova.compute.manager [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] No waiting events found dispatching network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:13 compute-0 nova_compute[182092]: 2026-01-23 09:19:13.841 182096 WARNING nova.compute.manager [req-31fe2e8c-6590-4c5c-b8ab-6e4d5409329c req-8f6ebdea-267b-465e-9fc3-c1d75855a60a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Received unexpected event network-vif-plugged-77cf3143-8e1d-4ff0-b03a-134abf5c1c58 for instance with vm_state deleted and task_state None.
Jan 23 09:19:16 compute-0 nova_compute[182092]: 2026-01-23 09:19:16.071 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:16 compute-0 nova_compute[182092]: 2026-01-23 09:19:16.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:16 compute-0 nova_compute[182092]: 2026-01-23 09:19:16.931 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:19 compute-0 podman[214863]: 2026-01-23 09:19:19.20525753 +0000 UTC m=+0.041802056 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:19:19 compute-0 podman[214862]: 2026-01-23 09:19:19.208291775 +0000 UTC m=+0.045096712 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 09:19:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:20.229 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:20 compute-0 nova_compute[182092]: 2026-01-23 09:19:20.229 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:20.230 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:19:21 compute-0 nova_compute[182092]: 2026-01-23 09:19:21.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:21 compute-0 nova_compute[182092]: 2026-01-23 09:19:21.933 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:22.231 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.845 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.845 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.893 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.983 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.983 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.988 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:19:22 compute-0 nova_compute[182092]: 2026-01-23 09:19:22.989 182096 INFO nova.compute.claims [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.085 182096 DEBUG nova.compute.provider_tree [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.095 182096 DEBUG nova.scheduler.client.report [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.109 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.109 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.149 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.149 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.163 182096 INFO nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.176 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.298 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.299 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.299 182096 INFO nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Creating image(s)
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.300 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.300 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.300 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.310 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.358 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.360 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.361 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.370 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.417 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.417 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.433 182096 DEBUG nova.policy [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66f8a16479154d23a187c0922062f421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '743f68086d8f4ae38a7b2fb3e91ce01c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.440 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.441 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.441 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.487 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.488 182096 DEBUG nova.virt.disk.api [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Checking if we can resize image /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.488 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.535 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.535 182096 DEBUG nova.virt.disk.api [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Cannot resize image /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.536 182096 DEBUG nova.objects.instance [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'migration_context' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.550 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.551 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Ensure instance console log exists: /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.551 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.551 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.551 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:23 compute-0 nova_compute[182092]: 2026-01-23 09:19:23.933 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Successfully created port: 242626ef-6515-431f-9f4d-fc4aabe2c080 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.817 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Successfully updated port: 242626ef-6515-431f-9f4d-fc4aabe2c080 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.829 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.829 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquired lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.829 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.970 182096 DEBUG nova.compute.manager [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-changed-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.970 182096 DEBUG nova.compute.manager [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Refreshing instance network info cache due to event network-changed-242626ef-6515-431f-9f4d-fc4aabe2c080. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:19:25 compute-0 nova_compute[182092]: 2026-01-23 09:19:25.970 182096 DEBUG oslo_concurrency.lockutils [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.014 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.776 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159951.7751124, ba63ce37-d255-4310-9b43-66422e2ad4b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.776 182096 INFO nova.compute.manager [-] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] VM Stopped (Lifecycle Event)
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.791 182096 DEBUG nova.compute.manager [None req-4ef45703-88f9-48b6-ade6-dff7becaeb5c - - - - - -] [instance: ba63ce37-d255-4310-9b43-66422e2ad4b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.798 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:26 compute-0 nova_compute[182092]: 2026-01-23 09:19:26.935 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.062 182096 DEBUG nova.network.neutron [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updating instance_info_cache with network_info: [{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.081 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Releasing lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.081 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance network_info: |[{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.082 182096 DEBUG oslo_concurrency.lockutils [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.082 182096 DEBUG nova.network.neutron [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Refreshing network info cache for port 242626ef-6515-431f-9f4d-fc4aabe2c080 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.084 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start _get_guest_xml network_info=[{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.087 182096 WARNING nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.091 182096 DEBUG nova.virt.libvirt.host [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.091 182096 DEBUG nova.virt.libvirt.host [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.095 182096 DEBUG nova.virt.libvirt.host [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.095 182096 DEBUG nova.virt.libvirt.host [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.096 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.096 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.096 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.097 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.097 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.097 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.097 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.097 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.098 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.098 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.098 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.098 182096 DEBUG nova.virt.hardware [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.101 182096 DEBUG nova.virt.libvirt.vif [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:23Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.101 182096 DEBUG nova.network.os_vif_util [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.102 182096 DEBUG nova.network.os_vif_util [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.102 182096 DEBUG nova.objects.instance [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.117 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <uuid>4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</uuid>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <name>instance-00000034</name>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-939361418</nova:name>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:27</nova:creationTime>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:user uuid="66f8a16479154d23a187c0922062f421">tempest-ListServerFiltersTestJSON-721478701-project-member</nova:user>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:project uuid="743f68086d8f4ae38a7b2fb3e91ce01c">tempest-ListServerFiltersTestJSON-721478701</nova:project>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         <nova:port uuid="242626ef-6515-431f-9f4d-fc4aabe2c080">
Jan 23 09:19:27 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="serial">4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="uuid">4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:2f:2b:87"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <target dev="tap242626ef-65"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/console.log" append="off"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:27 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:27 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:27 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:27 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:27 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.118 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Preparing to wait for external event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.118 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.118 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.118 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.119 182096 DEBUG nova.virt.libvirt.vif [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:23Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.119 182096 DEBUG nova.network.os_vif_util [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.120 182096 DEBUG nova.network.os_vif_util [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.120 182096 DEBUG os_vif [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.120 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.121 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.121 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.122 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.123 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242626ef-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.123 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242626ef-65, col_values=(('external_ids', {'iface-id': '242626ef-6515-431f-9f4d-fc4aabe2c080', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:2b:87', 'vm-uuid': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:27 compute-0 NetworkManager[54920]: <info>  [1769159967.1249] manager: (tap242626ef-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.124 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.126 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.127 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.128 182096 INFO os_vif [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65')
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.167 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.167 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.167 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] No VIF found with MAC fa:16:3e:2f:2b:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.167 182096 INFO nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Using config drive
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.943 182096 INFO nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Creating config drive at /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config
Jan 23 09:19:27 compute-0 nova_compute[182092]: 2026-01-23 09:19:27.947 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmfwzgij execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.065 182096 DEBUG oslo_concurrency.processutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmfwzgij" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:28 compute-0 kernel: tap242626ef-65: entered promiscuous mode
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.1007] manager: (tap242626ef-65): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 23 09:19:28 compute-0 ovn_controller[94697]: 2026-01-23T09:19:28Z|00132|binding|INFO|Claiming lport 242626ef-6515-431f-9f4d-fc4aabe2c080 for this chassis.
Jan 23 09:19:28 compute-0 ovn_controller[94697]: 2026-01-23T09:19:28Z|00133|binding|INFO|242626ef-6515-431f-9f4d-fc4aabe2c080: Claiming fa:16:3e:2f:2b:87 10.100.0.10
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.105 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.108 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:2b:87 10.100.0.10'], port_security=['fa:16:3e:2f:2b:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '743f68086d8f4ae38a7b2fb3e91ce01c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '980000d9-53ce-405d-b2a7-06d2e52da4ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef5e5d22-91ef-425b-9a12-147397ffa1e3, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=242626ef-6515-431f-9f4d-fc4aabe2c080) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.110 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 242626ef-6515-431f-9f4d-fc4aabe2c080 in datapath 63e19746-3286-4b7a-ad2d-556b0af85c5a bound to our chassis
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.111 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.120 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc9fa5c-0743-4d63-a9d1-83c3eb166995]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.121 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63e19746-31 in ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.123 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63e19746-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.123 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e5de1e61-0119-4803-b350-059448cd49e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 systemd-udevd[214935]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.125 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[932cbe42-fe1e-429e-b96b-7fbe892cd091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.1340] device (tap242626ef-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.137 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[27e0a297-c3d2-40bd-b992-b0c1008104bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.1410] device (tap242626ef-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:19:28 compute-0 systemd-machined[153562]: New machine qemu-21-instance-00000034.
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.143 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:28 compute-0 ovn_controller[94697]: 2026-01-23T09:19:28Z|00134|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 ovn-installed in OVS
Jan 23 09:19:28 compute-0 ovn_controller[94697]: 2026-01-23T09:19:28Z|00135|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 up in Southbound
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.145 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.147 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[15b08e43-e018-4310-95ba-416a3bae11fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000034.
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.168 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[53afb386-645f-4970-91d1-4e425c9706ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.171 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[085d4bdd-6aa8-4822-bb00-35e1f6ccc4d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.1719] manager: (tap63e19746-30): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Jan 23 09:19:28 compute-0 systemd-udevd[214940]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.191 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b5546822-f845-4e3b-b949-b3547fcaae66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.193 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ca82c388-9ce1-4d35-9f9f-ca2bc7d9b12e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.2087] device (tap63e19746-30): carrier: link connected
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.212 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[3856911e-d769-48ba-9af4-24bbab77342d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.225 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b1afdc8a-ed38-4b95-b3a5-de9e79ad3294]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63e19746-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:18:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347369, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214961, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.236 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[be780ce9-10cf-46b1-8aac-aeed9a00039b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:1823'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347369, 'tstamp': 347369}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214962, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.246 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1c8c94-c811-4125-8346-094458213917]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63e19746-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:18:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347369, 'reachable_time': 39949, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214963, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.266 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5293d8-c09d-4966-8a21-a96108602a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.303 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccc7ebd-2834-4105-ae42-29f07c67ee29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.304 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63e19746-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.304 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.305 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63e19746-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:28 compute-0 NetworkManager[54920]: <info>  [1769159968.3072] manager: (tap63e19746-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 23 09:19:28 compute-0 kernel: tap63e19746-30: entered promiscuous mode
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.309 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63e19746-30, col_values=(('external_ids', {'iface-id': 'b10bca13-6c88-49ff-b7f6-9cb6db2a6a71'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:28 compute-0 ovn_controller[94697]: 2026-01-23T09:19:28Z|00136|binding|INFO|Releasing lport b10bca13-6c88-49ff-b7f6-9cb6db2a6a71 from this chassis (sb_readonly=0)
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.306 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.323 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.324 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[28c65194-3e11-4eaf-a7bb-d56984d947e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.324 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:19:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:28.325 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'env', 'PROCESS_TAG=haproxy-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63e19746-3286-4b7a-ad2d-556b0af85c5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.432 182096 DEBUG nova.compute.manager [req-3f3097e7-a673-49e3-a964-30dbd6b19f33 req-5c908cb2-c3d1-4400-849c-0146f078e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.433 182096 DEBUG oslo_concurrency.lockutils [req-3f3097e7-a673-49e3-a964-30dbd6b19f33 req-5c908cb2-c3d1-4400-849c-0146f078e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.433 182096 DEBUG oslo_concurrency.lockutils [req-3f3097e7-a673-49e3-a964-30dbd6b19f33 req-5c908cb2-c3d1-4400-849c-0146f078e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.433 182096 DEBUG oslo_concurrency.lockutils [req-3f3097e7-a673-49e3-a964-30dbd6b19f33 req-5c908cb2-c3d1-4400-849c-0146f078e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:28 compute-0 nova_compute[182092]: 2026-01-23 09:19:28.433 182096 DEBUG nova.compute.manager [req-3f3097e7-a673-49e3-a964-30dbd6b19f33 req-5c908cb2-c3d1-4400-849c-0146f078e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Processing event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:19:28 compute-0 podman[214991]: 2026-01-23 09:19:28.6034644 +0000 UTC m=+0.033943717 container create bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:19:28 compute-0 systemd[1]: Started libpod-conmon-bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da.scope.
Jan 23 09:19:28 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:19:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10e34e9c86b6c01f3bf5b385d4c27644cdcdc5f7442dbca68f96ae1427536014/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:19:28 compute-0 podman[214991]: 2026-01-23 09:19:28.650549672 +0000 UTC m=+0.081028989 container init bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:19:28 compute-0 podman[214991]: 2026-01-23 09:19:28.655866755 +0000 UTC m=+0.086346072 container start bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:19:28 compute-0 podman[214991]: 2026-01-23 09:19:28.585560268 +0000 UTC m=+0.016039605 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:19:28 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [NOTICE]   (215007) : New worker (215009) forked
Jan 23 09:19:28 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [NOTICE]   (215007) : Loading success.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.195 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159969.1950462, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.195 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Started (Lifecycle Event)
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.197 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.201 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.203 182096 INFO nova.virt.libvirt.driver [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance spawned successfully.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.203 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.214 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.215 182096 DEBUG nova.network.neutron [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updated VIF entry in instance network info cache for port 242626ef-6515-431f-9f4d-fc4aabe2c080. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.216 182096 DEBUG nova.network.neutron [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updating instance_info_cache with network_info: [{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.223 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.225 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.225 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.225 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.226 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.226 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.226 182096 DEBUG nova.virt.libvirt.driver [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.247 182096 DEBUG oslo_concurrency.lockutils [req-bb48c77a-c2ad-4da8-8f8b-88dcf430ae11 req-73f4d6ff-fe0e-4102-bb5c-a70d067b9f7f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.248 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.248 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159969.1973045, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.248 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Paused (Lifecycle Event)
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.266 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.269 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159969.1992583, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.270 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Resumed (Lifecycle Event)
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.285 182096 INFO nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Took 5.99 seconds to spawn the instance on the hypervisor.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.285 182096 DEBUG nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.290 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.291 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.313 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.352 182096 INFO nova.compute.manager [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Took 6.40 seconds to build instance.
Jan 23 09:19:29 compute-0 nova_compute[182092]: 2026-01-23 09:19:29.367 182096 DEBUG oslo_concurrency.lockutils [None req-e4615396-7b1f-4075-b24a-9ccbf01e80e4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.517 182096 DEBUG nova.compute.manager [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.518 182096 DEBUG oslo_concurrency.lockutils [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.518 182096 DEBUG oslo_concurrency.lockutils [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.519 182096 DEBUG oslo_concurrency.lockutils [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.519 182096 DEBUG nova.compute.manager [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:30 compute-0 nova_compute[182092]: 2026-01-23 09:19:30.520 182096 WARNING nova.compute.manager [req-361f3868-769f-4643-a090-15fcc7ae421b req-9e458fda-fe0f-4323-b6ca-6f8e7bf6ffef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state active and task_state None.
Jan 23 09:19:31 compute-0 podman[215021]: 2026-01-23 09:19:31.207789235 +0000 UTC m=+0.041898435 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 09:19:31 compute-0 podman[215022]: 2026-01-23 09:19:31.213377149 +0000 UTC m=+0.045465807 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:19:31 compute-0 nova_compute[182092]: 2026-01-23 09:19:31.936 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:32 compute-0 nova_compute[182092]: 2026-01-23 09:19:32.124 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.681 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "50310be0-78cb-425f-ba64-fd847de259fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.681 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.698 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.785 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.785 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.791 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.791 182096 INFO nova.compute.claims [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.911 182096 DEBUG nova.compute.provider_tree [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.922 182096 DEBUG nova.scheduler.client.report [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.938 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:33 compute-0 nova_compute[182092]: 2026-01-23 09:19:33.940 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.005 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.005 182096 DEBUG nova.network.neutron [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.023 182096 INFO nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.037 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.130 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.131 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.131 182096 INFO nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Creating image(s)
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.131 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.132 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.132 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.143 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.200 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.203 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.203 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.213 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.227 182096 DEBUG nova.network.neutron [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.228 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.270 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.271 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.292 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.294 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.294 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.351 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.352 182096 DEBUG nova.virt.disk.api [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Checking if we can resize image /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.352 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.411 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.412 182096 DEBUG nova.virt.disk.api [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Cannot resize image /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.413 182096 DEBUG nova.objects.instance [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lazy-loading 'migration_context' on Instance uuid 50310be0-78cb-425f-ba64-fd847de259fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.427 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.428 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Ensure instance console log exists: /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.428 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.428 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.429 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.430 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.434 182096 WARNING nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.438 182096 DEBUG nova.virt.libvirt.host [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.438 182096 DEBUG nova.virt.libvirt.host [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.441 182096 DEBUG nova.virt.libvirt.host [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.441 182096 DEBUG nova.virt.libvirt.host [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.443 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.444 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.444 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.445 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.445 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.445 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.446 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.446 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.447 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.447 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.448 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.448 182096 DEBUG nova.virt.hardware [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.453 182096 DEBUG nova.objects.instance [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lazy-loading 'pci_devices' on Instance uuid 50310be0-78cb-425f-ba64-fd847de259fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.468 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <uuid>50310be0-78cb-425f-ba64-fd847de259fe</uuid>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <name>instance-00000037</name>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiagnosticsTest-server-863006930</nova:name>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:34</nova:creationTime>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:user uuid="497df8d451c94b6d9f9b0c73a4dcf508">tempest-ServerDiagnosticsTest-932894100-project-member</nova:user>
Jan 23 09:19:34 compute-0 nova_compute[182092]:         <nova:project uuid="8174e03b6b9c43bdb1390ae1dc1da388">tempest-ServerDiagnosticsTest-932894100</nova:project>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="serial">50310be0-78cb-425f-ba64-fd847de259fe</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="uuid">50310be0-78cb-425f-ba64-fd847de259fe</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.config"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/console.log" append="off"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:34 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:34 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:34 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:34 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:34 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.513 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.514 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.514 182096 INFO nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Using config drive
Jan 23 09:19:34 compute-0 podman[215075]: 2026-01-23 09:19:34.541166164 +0000 UTC m=+0.042723393 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, release=1755695350)
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.747 182096 INFO nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Creating config drive at /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.config
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.751 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoidq5ebd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:34 compute-0 nova_compute[182092]: 2026-01-23 09:19:34.870 182096 DEBUG oslo_concurrency.processutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoidq5ebd" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:34 compute-0 systemd-machined[153562]: New machine qemu-22-instance-00000037.
Jan 23 09:19:34 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000037.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.155 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159975.1538818, 50310be0-78cb-425f-ba64-fd847de259fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.155 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] VM Resumed (Lifecycle Event)
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.158 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.158 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.161 182096 INFO nova.virt.libvirt.driver [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance spawned successfully.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.161 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.174 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.178 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.181 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.181 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.182 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.182 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.182 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.183 182096 DEBUG nova.virt.libvirt.driver [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.219 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.219 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159975.154477, 50310be0-78cb-425f-ba64-fd847de259fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.219 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] VM Started (Lifecycle Event)
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.241 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.243 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.257 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.275 182096 INFO nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Took 1.15 seconds to spawn the instance on the hypervisor.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.275 182096 DEBUG nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.374 182096 INFO nova.compute.manager [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Took 1.62 seconds to build instance.
Jan 23 09:19:35 compute-0 nova_compute[182092]: 2026-01-23 09:19:35.404 182096 DEBUG oslo_concurrency.lockutils [None req-d5b0ad7b-fa64-4d3f-a0bb-662ca153fbcb 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:36 compute-0 nova_compute[182092]: 2026-01-23 09:19:36.927 182096 DEBUG nova.compute.manager [None req-6bf268d0-1b3c-49b3-8069-df3e528a7fad ad6d3a547bc34731b892c284d519f2ca f218f28166a64f708d4bcc79a6c2c722 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:36 compute-0 nova_compute[182092]: 2026-01-23 09:19:36.929 182096 INFO nova.compute.manager [None req-6bf268d0-1b3c-49b3-8069-df3e528a7fad ad6d3a547bc34731b892c284d519f2ca f218f28166a64f708d4bcc79a6c2c722 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Retrieving diagnostics
Jan 23 09:19:36 compute-0 nova_compute[182092]: 2026-01-23 09:19:36.937 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.125 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.297 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "50310be0-78cb-425f-ba64-fd847de259fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.297 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.297 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "50310be0-78cb-425f-ba64-fd847de259fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.298 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.298 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.309 182096 INFO nova.compute.manager [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Terminating instance
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.317 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "refresh_cache-50310be0-78cb-425f-ba64-fd847de259fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.318 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquired lock "refresh_cache-50310be0-78cb-425f-ba64-fd847de259fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.318 182096 DEBUG nova.network.neutron [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.498 182096 DEBUG nova.network.neutron [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.719 182096 DEBUG nova.network.neutron [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.735 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Releasing lock "refresh_cache-50310be0-78cb-425f-ba64-fd847de259fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.735 182096 DEBUG nova.compute.manager [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:19:37 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 23 09:19:37 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Consumed 2.780s CPU time.
Jan 23 09:19:37 compute-0 systemd-machined[153562]: Machine qemu-22-instance-00000037 terminated.
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.965 182096 INFO nova.virt.libvirt.driver [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance destroyed successfully.
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.966 182096 DEBUG nova.objects.instance [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lazy-loading 'resources' on Instance uuid 50310be0-78cb-425f-ba64-fd847de259fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.992 182096 INFO nova.virt.libvirt.driver [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Deleting instance files /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe_del
Jan 23 09:19:37 compute-0 nova_compute[182092]: 2026-01-23 09:19:37.992 182096 INFO nova.virt.libvirt.driver [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Deletion of /var/lib/nova/instances/50310be0-78cb-425f-ba64-fd847de259fe_del complete
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.080 182096 INFO nova.compute.manager [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.080 182096 DEBUG oslo.service.loopingcall [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.080 182096 DEBUG nova.compute.manager [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.081 182096 DEBUG nova.network.neutron [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.203 182096 DEBUG nova.network.neutron [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.216 182096 DEBUG nova.network.neutron [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.226 182096 INFO nova.compute.manager [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Took 0.14 seconds to deallocate network for instance.
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.282 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.283 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.329 182096 DEBUG nova.compute.provider_tree [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.349 182096 DEBUG nova.scheduler.client.report [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.362 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.384 182096 INFO nova.scheduler.client.report [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Deleted allocations for instance 50310be0-78cb-425f-ba64-fd847de259fe
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.440 182096 DEBUG oslo_concurrency.lockutils [None req-8914e767-fab9-4503-aa2d-b50c5066d968 497df8d451c94b6d9f9b0c73a4dcf508 8174e03b6b9c43bdb1390ae1dc1da388 - - default default] Lock "50310be0-78cb-425f-ba64-fd847de259fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.444 182096 DEBUG oslo_concurrency.lockutils [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.444 182096 DEBUG oslo_concurrency.lockutils [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.445 182096 DEBUG nova.compute.manager [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.447 182096 DEBUG nova.compute.manager [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.447 182096 DEBUG nova.objects.instance [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'flavor' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.464 182096 DEBUG nova.objects.instance [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'info_cache' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:38 compute-0 nova_compute[182092]: 2026-01-23 09:19:38.480 182096 DEBUG nova.virt.libvirt.driver [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:19:39 compute-0 ovn_controller[94697]: 2026-01-23T09:19:39Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:2b:87 10.100.0.10
Jan 23 09:19:39 compute-0 ovn_controller[94697]: 2026-01-23T09:19:39Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:2b:87 10.100.0.10
Jan 23 09:19:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:39.855 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:41 compute-0 nova_compute[182092]: 2026-01-23 09:19:41.938 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:42 compute-0 nova_compute[182092]: 2026-01-23 09:19:42.126 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:42 compute-0 podman[215136]: 2026-01-23 09:19:42.247228914 +0000 UTC m=+0.083424549 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.755 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.755 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.771 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.857 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.857 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.861 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.861 182096 INFO nova.compute.claims [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.980 182096 DEBUG nova.compute.provider_tree [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:44 compute-0 nova_compute[182092]: 2026-01-23 09:19:44.992 182096 DEBUG nova.scheduler.client.report [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.007 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.008 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.019 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.019 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.042 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.069 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.069 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.089 182096 INFO nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.101 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.116 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.116 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.121 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.121 182096 INFO nova.compute.claims [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.223 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.224 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.224 182096 INFO nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Creating image(s)
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.225 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.225 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.226 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.237 182096 DEBUG nova.policy [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.238 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.285 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.286 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.287 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.296 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.327 182096 DEBUG nova.compute.provider_tree [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.342 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.343 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.362 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.363 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.364 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.376 182096 DEBUG nova.scheduler.client.report [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.402 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.402 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.408 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.408 182096 DEBUG nova.virt.disk.api [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Checking if we can resize image /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.409 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.452 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.462 182096 INFO nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.465 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.465 182096 DEBUG nova.virt.disk.api [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Cannot resize image /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.466 182096 DEBUG nova.objects.instance [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'migration_context' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.475 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.475 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Ensure instance console log exists: /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.475 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.476 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.476 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.477 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.544 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.545 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.545 182096 INFO nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating image(s)
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.546 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.546 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.547 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.557 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.603 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.604 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.605 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.614 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.659 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.660 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.679 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.680 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.680 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.724 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.725 182096 DEBUG nova.virt.disk.api [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Checking if we can resize image /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.726 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.771 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.772 182096 DEBUG nova.virt.disk.api [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Cannot resize image /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.773 182096 DEBUG nova.objects.instance [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'migration_context' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.791 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.792 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Ensure instance console log exists: /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.792 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.793 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.793 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.794 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.797 182096 WARNING nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.802 182096 DEBUG nova.virt.libvirt.host [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.802 182096 DEBUG nova.virt.libvirt.host [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.807 182096 DEBUG nova.virt.libvirt.host [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.807 182096 DEBUG nova.virt.libvirt.host [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.808 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.809 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.809 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.810 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.810 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.810 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.810 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.811 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.811 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.811 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.811 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.812 182096 DEBUG nova.virt.hardware [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.815 182096 DEBUG nova.objects.instance [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'pci_devices' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.833 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <uuid>c58940b5-b4b8-4e1b-a741-cce80dd02096</uuid>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <name>instance-00000039</name>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1311020306</nova:name>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:45</nova:creationTime>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:user uuid="476826df6a2744be869eff79367a0516">tempest-UnshelveToHostMultiNodesTest-1442713466-project-member</nova:user>
Jan 23 09:19:45 compute-0 nova_compute[182092]:         <nova:project uuid="ce124b37bc3b4586ad3473321562d9fb">tempest-UnshelveToHostMultiNodesTest-1442713466</nova:project>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="serial">c58940b5-b4b8-4e1b-a741-cce80dd02096</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="uuid">c58940b5-b4b8-4e1b-a741-cce80dd02096</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/console.log" append="off"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:45 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:45 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:45 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:45 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:45 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.873 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.873 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:45 compute-0 nova_compute[182092]: 2026-01-23 09:19:45.873 182096 INFO nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Using config drive
Jan 23 09:19:46 compute-0 nova_compute[182092]: 2026-01-23 09:19:46.740 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully created port: 592a06ee-2695-49b2-811e-88ecd02c4cc7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:19:46 compute-0 nova_compute[182092]: 2026-01-23 09:19:46.745 182096 INFO nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating config drive at /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config
Jan 23 09:19:46 compute-0 nova_compute[182092]: 2026-01-23 09:19:46.749 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1amfb059 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:46 compute-0 nova_compute[182092]: 2026-01-23 09:19:46.868 182096 DEBUG oslo_concurrency.processutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1amfb059" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:46 compute-0 systemd-machined[153562]: New machine qemu-23-instance-00000039.
Jan 23 09:19:46 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000039.
Jan 23 09:19:46 compute-0 nova_compute[182092]: 2026-01-23 09:19:46.940 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.127 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.322 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully updated port: 592a06ee-2695-49b2-811e-88ecd02c4cc7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.356 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.357 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.357 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.434 182096 DEBUG nova.compute.manager [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-changed-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.434 182096 DEBUG nova.compute.manager [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing instance network info cache due to event network-changed-592a06ee-2695-49b2-811e-88ecd02c4cc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.434 182096 DEBUG oslo_concurrency.lockutils [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:47 compute-0 nova_compute[182092]: 2026-01-23 09:19:47.529 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.510 182096 DEBUG nova.virt.libvirt.driver [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.935 182096 DEBUG nova.network.neutron [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.952 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.952 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Instance network_info: |[{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.953 182096 DEBUG oslo_concurrency.lockutils [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.953 182096 DEBUG nova.network.neutron [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing network info cache for port 592a06ee-2695-49b2-811e-88ecd02c4cc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.955 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Start _get_guest_xml network_info=[{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.958 182096 WARNING nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.963 182096 DEBUG nova.virt.libvirt.host [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.964 182096 DEBUG nova.virt.libvirt.host [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.968 182096 DEBUG nova.virt.libvirt.host [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.969 182096 DEBUG nova.virt.libvirt.host [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.969 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.970 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.970 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.970 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.970 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.970 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.971 182096 DEBUG nova.virt.hardware [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.974 182096 DEBUG nova.virt.libvirt.vif [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.974 182096 DEBUG nova.network.os_vif_util [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.975 182096 DEBUG nova.network.os_vif_util [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.975 182096 DEBUG nova.objects.instance [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_devices' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.988 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <uuid>aaa5d40f-5771-495a-9e1a-76dab011324d</uuid>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <name>instance-00000038</name>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:48</nova:creationTime>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:19:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="serial">aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="uuid">aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:38:78:6d"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <target dev="tap592a06ee-26"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log" append="off"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:48 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:48 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:48 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:48 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:48 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.988 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Preparing to wait for external event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.989 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.989 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.989 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.989 182096 DEBUG nova.virt.libvirt.vif [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.990 182096 DEBUG nova.network.os_vif_util [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.990 182096 DEBUG nova.network.os_vif_util [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.990 182096 DEBUG os_vif [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.991 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.991 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.991 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.993 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.993 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap592a06ee-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.993 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap592a06ee-26, col_values=(('external_ids', {'iface-id': '592a06ee-2695-49b2-811e-88ecd02c4cc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:78:6d', 'vm-uuid': 'aaa5d40f-5771-495a-9e1a-76dab011324d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.994 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:48 compute-0 NetworkManager[54920]: <info>  [1769159988.9955] manager: (tap592a06ee-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 23 09:19:48 compute-0 nova_compute[182092]: 2026-01-23 09:19:48.996 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.000 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.000 182096 INFO os_vif [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26')
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.036 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.036 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.037 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:38:78:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.037 182096 INFO nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Using config drive
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.190 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159989.1903293, c58940b5-b4b8-4e1b-a741-cce80dd02096 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.191 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] VM Resumed (Lifecycle Event)
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.193 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.193 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.199 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance spawned successfully.
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.200 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.210 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.212 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.217 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.217 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.217 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.218 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.218 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.218 182096 DEBUG nova.virt.libvirt.driver [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.224 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.224 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159989.191224, c58940b5-b4b8-4e1b-a741-cce80dd02096 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.224 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] VM Started (Lifecycle Event)
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.244 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.246 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.263 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.275 182096 INFO nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Took 3.73 seconds to spawn the instance on the hypervisor.
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.276 182096 DEBUG nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.322 182096 INFO nova.compute.manager [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Took 4.23 seconds to build instance.
Jan 23 09:19:49 compute-0 nova_compute[182092]: 2026-01-23 09:19:49.347 182096 DEBUG oslo_concurrency.lockutils [None req-2eeabe0e-f36c-43d5-9c5b-af56a4295158 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:50 compute-0 podman[215221]: 2026-01-23 09:19:50.206669049 +0000 UTC m=+0.042768138 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:19:50 compute-0 podman[215222]: 2026-01-23 09:19:50.209324068 +0000 UTC m=+0.044223864 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:19:50 compute-0 kernel: tap242626ef-65 (unregistering): left promiscuous mode
Jan 23 09:19:50 compute-0 NetworkManager[54920]: <info>  [1769159990.6435] device (tap242626ef-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.653 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.655 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 ovn_controller[94697]: 2026-01-23T09:19:50Z|00137|binding|INFO|Releasing lport 242626ef-6515-431f-9f4d-fc4aabe2c080 from this chassis (sb_readonly=0)
Jan 23 09:19:50 compute-0 ovn_controller[94697]: 2026-01-23T09:19:50Z|00138|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 down in Southbound
Jan 23 09:19:50 compute-0 ovn_controller[94697]: 2026-01-23T09:19:50Z|00139|binding|INFO|Removing iface tap242626ef-65 ovn-installed in OVS
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.659 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:2b:87 10.100.0.10'], port_security=['fa:16:3e:2f:2b:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '743f68086d8f4ae38a7b2fb3e91ce01c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '980000d9-53ce-405d-b2a7-06d2e52da4ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef5e5d22-91ef-425b-9a12-147397ffa1e3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=242626ef-6515-431f-9f4d-fc4aabe2c080) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.660 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 242626ef-6515-431f-9f4d-fc4aabe2c080 in datapath 63e19746-3286-4b7a-ad2d-556b0af85c5a unbound from our chassis
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.661 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63e19746-3286-4b7a-ad2d-556b0af85c5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.672 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b81a38-2dad-4a7f-be88-0b3691ef566c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.673 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a namespace which is not needed anymore
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.677 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 23 09:19:50 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000034.scope: Consumed 12.560s CPU time.
Jan 23 09:19:50 compute-0 systemd-machined[153562]: Machine qemu-21-instance-00000034 terminated.
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.751 182096 INFO nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Creating config drive at /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.756 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dqd7fze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:50 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [NOTICE]   (215007) : haproxy version is 2.8.14-c23fe91
Jan 23 09:19:50 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [NOTICE]   (215007) : path to executable is /usr/sbin/haproxy
Jan 23 09:19:50 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [ALERT]    (215007) : Current worker (215009) exited with code 143 (Terminated)
Jan 23 09:19:50 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215003]: [WARNING]  (215007) : All workers exited. Exiting... (0)
Jan 23 09:19:50 compute-0 systemd[1]: libpod-bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da.scope: Deactivated successfully.
Jan 23 09:19:50 compute-0 podman[215284]: 2026-01-23 09:19:50.779672512 +0000 UTC m=+0.042811330 container died bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:19:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da-userdata-shm.mount: Deactivated successfully.
Jan 23 09:19:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-10e34e9c86b6c01f3bf5b385d4c27644cdcdc5f7442dbca68f96ae1427536014-merged.mount: Deactivated successfully.
Jan 23 09:19:50 compute-0 podman[215284]: 2026-01-23 09:19:50.80233376 +0000 UTC m=+0.065472577 container cleanup bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:19:50 compute-0 systemd[1]: libpod-conmon-bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da.scope: Deactivated successfully.
Jan 23 09:19:50 compute-0 podman[215309]: 2026-01-23 09:19:50.842688613 +0000 UTC m=+0.025063722 container remove bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.845 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f5a7a7-2e47-4218-adf1-37d352ac1b3a]: (4, ('Fri Jan 23 09:19:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a (bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da)\nbc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da\nFri Jan 23 09:19:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a (bc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da)\nbc221649be5a12f88d390c74cb374ed9c314f2ee26df8f047a2949c2972a82da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.847 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cb3b18-7f36-4556-8103-df524ff322b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.848 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63e19746-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.850 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 kernel: tap63e19746-30: left promiscuous mode
Jan 23 09:19:50 compute-0 NetworkManager[54920]: <info>  [1769159990.8592] manager: (tap242626ef-65): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.875 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.878 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[548feeda-4124-4c75-90a4-0d451f0ef95a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.881 182096 DEBUG oslo_concurrency.processutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dqd7fze" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.889 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a882d7c7-72c2-4f42-8825-0055962c43cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.890 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[41caa3b4-a674-4fbd-b564-ff186700bbb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.910 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d630ed86-0863-47d8-a50a-7b14d983d875]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347365, 'reachable_time': 42925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215336, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d63e19746\x2d3286\x2d4b7a\x2dad2d\x2d556b0af85c5a.mount: Deactivated successfully.
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.912 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.912 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e9d9c0-0b71-43e9-b985-f08b8c073178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 systemd-udevd[215262]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:50 compute-0 NetworkManager[54920]: <info>  [1769159990.9382] manager: (tap592a06ee-26): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Jan 23 09:19:50 compute-0 kernel: tap592a06ee-26: entered promiscuous mode
Jan 23 09:19:50 compute-0 ovn_controller[94697]: 2026-01-23T09:19:50Z|00140|binding|INFO|Claiming lport 592a06ee-2695-49b2-811e-88ecd02c4cc7 for this chassis.
Jan 23 09:19:50 compute-0 ovn_controller[94697]: 2026-01-23T09:19:50Z|00141|binding|INFO|592a06ee-2695-49b2-811e-88ecd02c4cc7: Claiming fa:16:3e:38:78:6d 10.100.0.7
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.945 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 nova_compute[182092]: 2026-01-23 09:19:50.948 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.955 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:6d 10.100.0.7'], port_security=['fa:16:3e:38:78:6d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18938afc-cae3-477c-aee1-5f343f0a5140', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=592a06ee-2695-49b2-811e-88ecd02c4cc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.956 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 592a06ee-2695-49b2-811e-88ecd02c4cc7 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.957 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:19:50 compute-0 NetworkManager[54920]: <info>  [1769159990.9582] device (tap592a06ee-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:19:50 compute-0 NetworkManager[54920]: <info>  [1769159990.9589] device (tap592a06ee-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.972 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0819b835-bedb-4769-b397-2db8c04ceb62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.975 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap502ff19d-71 in ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.976 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap502ff19d-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.976 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbd85c7-8978-4b44-ae20-73f3ca8b90e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.977 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6a37f746-a1e5-4402-9ab2-66bcc16f4d3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:50 compute-0 systemd-machined[153562]: New machine qemu-24-instance-00000038.
Jan 23 09:19:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:50.987 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7af69468-0b69-4088-a358-b601247a6b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000038.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.014 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 ovn_controller[94697]: 2026-01-23T09:19:51Z|00142|binding|INFO|Setting lport 592a06ee-2695-49b2-811e-88ecd02c4cc7 ovn-installed in OVS
Jan 23 09:19:51 compute-0 ovn_controller[94697]: 2026-01-23T09:19:51Z|00143|binding|INFO|Setting lport 592a06ee-2695-49b2-811e-88ecd02c4cc7 up in Southbound
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.017 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[47699059-dc18-4fbc-bf81-eccd89e153e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.018 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.051 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[89d29716-fa62-4e64-9731-e1736569b094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 NetworkManager[54920]: <info>  [1769159991.0565] manager: (tap502ff19d-70): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.056 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34e9e989-2400-4ae7-b588-3686ff938454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.087 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[73472121-28db-4444-95aa-24d17e55f69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.089 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c7002168-c584-4eb7-b0ff-0c06f8ff769b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 NetworkManager[54920]: <info>  [1769159991.1127] device (tap502ff19d-70): carrier: link connected
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.117 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[99d472f3-2808-41d5-b5d1-f929bcac2842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.132 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ff43715d-db14-4870-b754-369a5faa71e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215386, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.145 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[49f07970-20ad-4264-a01e-beb6971a69c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:cef2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349659, 'tstamp': 349659}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215387, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.158 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[df988716-d83e-44d5-b31a-6778ef0ae5d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215388, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.182 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[87412c55-18c5-40d8-968f-d165252a55af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.189 182096 DEBUG nova.compute.manager [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.190 182096 DEBUG oslo_concurrency.lockutils [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.190 182096 DEBUG oslo_concurrency.lockutils [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.191 182096 DEBUG oslo_concurrency.lockutils [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.191 182096 DEBUG nova.compute.manager [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.191 182096 WARNING nova.compute.manager [req-5732b79f-558e-4106-b669-1649d9295bf4 req-6ab50e21-399d-4380-bb7f-7682fc7e64ea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state active and task_state powering-off.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.224 182096 DEBUG nova.compute.manager [req-1679d730-bfdd-4d6d-b58b-a8506a57a60e req-f88a413c-cede-4dd8-a994-89556a23f96a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.225 182096 DEBUG oslo_concurrency.lockutils [req-1679d730-bfdd-4d6d-b58b-a8506a57a60e req-f88a413c-cede-4dd8-a994-89556a23f96a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.225 182096 DEBUG oslo_concurrency.lockutils [req-1679d730-bfdd-4d6d-b58b-a8506a57a60e req-f88a413c-cede-4dd8-a994-89556a23f96a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.225 182096 DEBUG oslo_concurrency.lockutils [req-1679d730-bfdd-4d6d-b58b-a8506a57a60e req-f88a413c-cede-4dd8-a994-89556a23f96a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.225 182096 DEBUG nova.compute.manager [req-1679d730-bfdd-4d6d-b58b-a8506a57a60e req-f88a413c-cede-4dd8-a994-89556a23f96a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Processing event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.236 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0e020e-a9be-4a8b-8e10-cb60ec15b74e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.238 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.239 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.239 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:51 compute-0 NetworkManager[54920]: <info>  [1769159991.2415] manager: (tap502ff19d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.242 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 kernel: tap502ff19d-70: entered promiscuous mode
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.244 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.250 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:51 compute-0 ovn_controller[94697]: 2026-01-23T09:19:51Z|00144|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.251 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.264 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.266 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.267 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.268 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cef86426-230e-4260-b83f-b7e6bf22708d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.268 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:19:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:51.270 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'env', 'PROCESS_TAG=haproxy-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/502ff19d-7b13-4dc2-8ece-02806b418ba0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.406 182096 DEBUG nova.network.neutron [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updated VIF entry in instance network info cache for port 592a06ee-2695-49b2-811e-88ecd02c4cc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.406 182096 DEBUG nova.network.neutron [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.422 182096 DEBUG oslo_concurrency.lockutils [req-b198cc33-e5ea-4719-bd55-4b1ba3b8ab60 req-4b3447eb-b535-43b2-b621-dfa91053c78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.440 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159991.4407158, aaa5d40f-5771-495a-9e1a-76dab011324d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.441 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] VM Started (Lifecycle Event)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.443 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.447 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.451 182096 INFO nova.virt.libvirt.driver [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Instance spawned successfully.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.452 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.464 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.470 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.472 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.473 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.473 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.474 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.474 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.474 182096 DEBUG nova.virt.libvirt.driver [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.502 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.503 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159991.4407532, aaa5d40f-5771-495a-9e1a-76dab011324d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.503 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] VM Paused (Lifecycle Event)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.519 182096 INFO nova.virt.libvirt.driver [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance shutdown successfully after 13 seconds.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.524 182096 INFO nova.virt.libvirt.driver [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance destroyed successfully.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.525 182096 DEBUG nova.objects.instance [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'numa_topology' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.530 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.533 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159991.4464002, aaa5d40f-5771-495a-9e1a-76dab011324d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.533 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] VM Resumed (Lifecycle Event)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.544 182096 DEBUG nova.compute.manager [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.551 182096 INFO nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Took 6.33 seconds to spawn the instance on the hypervisor.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.551 182096 DEBUG nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.553 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.558 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.588 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.623 182096 DEBUG oslo_concurrency.lockutils [None req-862453c1-7f48-4133-bcfb-0a54296bd7f0 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:51 compute-0 podman[215428]: 2026-01-23 09:19:51.629969734 +0000 UTC m=+0.038108866 container create f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.634 182096 INFO nova.compute.manager [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Took 6.81 seconds to build instance.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.650 182096 DEBUG oslo_concurrency.lockutils [None req-4dea6957-b253-4204-9eca-768ca6f170d8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:51 compute-0 systemd[1]: Started libpod-conmon-f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12.scope.
Jan 23 09:19:51 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:19:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cc901fb96740599edac43ffe76c6706020a93445cbdfbb297a6474cb59ad00b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:19:51 compute-0 podman[215428]: 2026-01-23 09:19:51.69445057 +0000 UTC m=+0.102589722 container init f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:19:51 compute-0 podman[215428]: 2026-01-23 09:19:51.699089171 +0000 UTC m=+0.107228303 container start f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:19:51 compute-0 podman[215428]: 2026-01-23 09:19:51.611931409 +0000 UTC m=+0.020070560 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:19:51 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:19:51 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:19:51 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [NOTICE]   (215444) : New worker (215447) forked
Jan 23 09:19:51 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [NOTICE]   (215444) : Loading success.
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.942 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.996 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.996 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:51 compute-0 nova_compute[182092]: 2026-01-23 09:19:51.996 182096 INFO nova.compute.manager [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shelving
Jan 23 09:19:52 compute-0 nova_compute[182092]: 2026-01-23 09:19:52.030 182096 DEBUG nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:19:52 compute-0 nova_compute[182092]: 2026-01-23 09:19:52.965 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769159977.964947, 50310be0-78cb-425f-ba64-fd847de259fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:52 compute-0 nova_compute[182092]: 2026-01-23 09:19:52.965 182096 INFO nova.compute.manager [-] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] VM Stopped (Lifecycle Event)
Jan 23 09:19:52 compute-0 nova_compute[182092]: 2026-01-23 09:19:52.984 182096 DEBUG nova.compute.manager [None req-9a88fdf6-fc28-4a2e-874c-64f837a5d981 - - - - - -] [instance: 50310be0-78cb-425f-ba64-fd847de259fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.275 182096 DEBUG nova.compute.manager [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.275 182096 DEBUG oslo_concurrency.lockutils [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.275 182096 DEBUG oslo_concurrency.lockutils [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.275 182096 DEBUG oslo_concurrency.lockutils [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.276 182096 DEBUG nova.compute.manager [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.276 182096 WARNING nova.compute.manager [req-dd9bc2c1-18b9-4de6-af06-d573acb116fa req-de4a6abc-6d67-47ae-9c34-3882ed8ebf59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state stopped and task_state None.
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.300 182096 DEBUG nova.compute.manager [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.300 182096 DEBUG oslo_concurrency.lockutils [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.300 182096 DEBUG oslo_concurrency.lockutils [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.301 182096 DEBUG oslo_concurrency.lockutils [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.301 182096 DEBUG nova.compute.manager [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.301 182096 WARNING nova.compute.manager [req-27c61fdf-71c4-4764-aa4d-04e5bd4bedeb req-eb196f13-016a-4194-a72c-2db808a8e4b6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 for instance with vm_state active and task_state None.
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.372 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'flavor' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.397 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'info_cache' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.414 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.414 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquired lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.415 182096 DEBUG nova.network.neutron [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:19:53 compute-0 nova_compute[182092]: 2026-01-23 09:19:53.996 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:54 compute-0 NetworkManager[54920]: <info>  [1769159994.2224] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 23 09:19:54 compute-0 NetworkManager[54920]: <info>  [1769159994.2233] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.224 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:54 compute-0 ovn_controller[94697]: 2026-01-23T09:19:54Z|00145|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:19:54 compute-0 ovn_controller[94697]: 2026-01-23T09:19:54Z|00146|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.269 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.281 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.495 182096 DEBUG nova.compute.manager [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-changed-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.495 182096 DEBUG nova.compute.manager [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing instance network info cache due to event network-changed-592a06ee-2695-49b2-811e-88ecd02c4cc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.496 182096 DEBUG oslo_concurrency.lockutils [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.496 182096 DEBUG oslo_concurrency.lockutils [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.496 182096 DEBUG nova.network.neutron [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing network info cache for port 592a06ee-2695-49b2-811e-88ecd02c4cc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.660 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:54 compute-0 nova_compute[182092]: 2026-01-23 09:19:54.661 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.205 182096 DEBUG nova.network.neutron [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updating instance_info_cache with network_info: [{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.220 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Releasing lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.238 182096 INFO nova.virt.libvirt.driver [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance destroyed successfully.
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.239 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'numa_topology' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.254 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'resources' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.262 182096 DEBUG nova.virt.libvirt.vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.262 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.263 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.263 182096 DEBUG os_vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.265 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.265 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242626ef-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.266 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.268 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.270 182096 INFO os_vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65')
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.276 182096 DEBUG nova.virt.libvirt.driver [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start _get_guest_xml network_info=[{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.278 182096 WARNING nova.virt.libvirt.driver [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.282 182096 DEBUG nova.virt.libvirt.host [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.283 182096 DEBUG nova.virt.libvirt.host [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.285 182096 DEBUG nova.virt.libvirt.host [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.285 182096 DEBUG nova.virt.libvirt.host [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.286 182096 DEBUG nova.virt.libvirt.driver [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.286 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.287 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.287 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.287 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.287 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.288 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.288 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.288 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.289 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.289 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.289 182096 DEBUG nova.virt.hardware [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.289 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.300 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.366 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.366 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.367 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.367 182096 DEBUG oslo_concurrency.lockutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.368 182096 DEBUG nova.virt.libvirt.vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.369 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.370 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.370 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.389 182096 DEBUG nova.virt.libvirt.driver [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <uuid>4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</uuid>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <name>instance-00000034</name>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-939361418</nova:name>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:19:55</nova:creationTime>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:user uuid="66f8a16479154d23a187c0922062f421">tempest-ListServerFiltersTestJSON-721478701-project-member</nova:user>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:project uuid="743f68086d8f4ae38a7b2fb3e91ce01c">tempest-ListServerFiltersTestJSON-721478701</nova:project>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         <nova:port uuid="242626ef-6515-431f-9f4d-fc4aabe2c080">
Jan 23 09:19:55 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <system>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="serial">4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="uuid">4dd3c99c-cee8-4856-b2a2-2bd42f7ac038</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </system>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <os>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </os>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <features>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </features>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk.config"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:2f:2b:87"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <target dev="tap242626ef-65"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/console.log" append="off"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <video>
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </video>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:19:55 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:19:55 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:19:55 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:19:55 compute-0 nova_compute[182092]: </domain>
Jan 23 09:19:55 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.394 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.452 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.453 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.510 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.512 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.524 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.581 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.582 182096 DEBUG nova.virt.disk.api [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Checking if we can resize image /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.583 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.641 182096 DEBUG oslo_concurrency.processutils [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.642 182096 DEBUG nova.virt.disk.api [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Cannot resize image /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.642 182096 DEBUG nova.objects.instance [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'migration_context' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.652 182096 DEBUG nova.virt.libvirt.vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:19:51Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.653 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.653 182096 DEBUG nova.network.os_vif_util [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.654 182096 DEBUG os_vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.654 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.655 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.655 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.658 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.658 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242626ef-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.658 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242626ef-65, col_values=(('external_ids', {'iface-id': '242626ef-6515-431f-9f4d-fc4aabe2c080', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:2b:87', 'vm-uuid': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.6606] manager: (tap242626ef-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.662 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.663 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.664 182096 INFO os_vif [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65')
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.668 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.668 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.7137] manager: (tap242626ef-65): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 23 09:19:55 compute-0 kernel: tap242626ef-65: entered promiscuous mode
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.718 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 ovn_controller[94697]: 2026-01-23T09:19:55Z|00147|binding|INFO|Claiming lport 242626ef-6515-431f-9f4d-fc4aabe2c080 for this chassis.
Jan 23 09:19:55 compute-0 ovn_controller[94697]: 2026-01-23T09:19:55Z|00148|binding|INFO|242626ef-6515-431f-9f4d-fc4aabe2c080: Claiming fa:16:3e:2f:2b:87 10.100.0.10
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.721 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.735 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:2b:87 10.100.0.10'], port_security=['fa:16:3e:2f:2b:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '743f68086d8f4ae38a7b2fb3e91ce01c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '980000d9-53ce-405d-b2a7-06d2e52da4ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef5e5d22-91ef-425b-9a12-147397ffa1e3, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=242626ef-6515-431f-9f4d-fc4aabe2c080) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.736 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 242626ef-6515-431f-9f4d-fc4aabe2c080 in datapath 63e19746-3286-4b7a-ad2d-556b0af85c5a bound to our chassis
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.737 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.746 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[649bfc7f-1ffd-4734-8529-4ba55185250c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.749 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63e19746-31 in ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.750 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63e19746-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7c32b6-7309-4c48-8c56-b88ea2c77a0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 systemd-udevd[215486]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.751 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e27e2a-0edb-4c37-9f25-7b8c4bdd8f55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 systemd-machined[153562]: New machine qemu-25-instance-00000034.
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.768 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[9484ce8c-4765-444b-99b4-c6c231e7c2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000034.
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.772 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.7750] device (tap242626ef-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.7757] device (tap242626ef-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:19:55 compute-0 ovn_controller[94697]: 2026-01-23T09:19:55Z|00149|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 ovn-installed in OVS
Jan 23 09:19:55 compute-0 ovn_controller[94697]: 2026-01-23T09:19:55Z|00150|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 up in Southbound
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.781 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.793 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1ea2ee-e0fc-4109-86a1-857b36d26b8e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.794 182096 DEBUG nova.network.neutron [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updated VIF entry in instance network info cache for port 592a06ee-2695-49b2-811e-88ecd02c4cc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.795 182096 DEBUG nova.network.neutron [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.812 182096 DEBUG oslo_concurrency.lockutils [req-8d205bb3-da7a-4fd8-8e54-ba56b7a92e06 req-a1b81bed-7317-4a79-86be-0893ce2f923b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.824 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb4513b-6fa0-4833-bf68-296060625fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.828 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[45d867ce-6cf6-46ea-8a3e-f959dfa18801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.8312] manager: (tap63e19746-30): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.866 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c06268b8-3ba5-4a15-8181-0c8b5d3ae16b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.868 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5e84b8-de81-4f21-8045-e8a96630eab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.879 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 NetworkManager[54920]: <info>  [1769159995.9042] device (tap63e19746-30): carrier: link connected
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.905 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4e12b078-62fc-400c-88ce-e93f48095807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.931 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2497fd36-dd98-4cb6-afad-4e356e92c6e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63e19746-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:18:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350138, 'reachable_time': 44800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215512, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.943 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[78b6c756-09ab-40f2-ba82-cb1c3dfd3501]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:1823'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 350138, 'tstamp': 350138}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215513, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.949 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.950 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.956 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b1ec71-67a0-4414-9e5f-d986a9e69f9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63e19746-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:18:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350138, 'reachable_time': 44800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215517, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.988 182096 DEBUG nova.compute.manager [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:55.992 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6bedfa62-ad6a-45c5-aabc-54bd7e2aafa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:55 compute-0 nova_compute[182092]: 2026-01-23 09:19:55.993 182096 DEBUG oslo_concurrency.lockutils [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.007 182096 DEBUG oslo_concurrency.lockutils [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.008 182096 DEBUG oslo_concurrency.lockutils [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.008 182096 DEBUG nova.compute.manager [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.009 182096 WARNING nova.compute.manager [req-a6e29920-c8e1-4ef9-a7cf-e66b00463391 req-babc52d7-b6b9-49d1-8bca-d34976d8f9b5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state stopped and task_state powering-on.
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.051 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.055 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.131 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ea93ac5e-b2fb-43b8-84ff-dacfb089259b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.132 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63e19746-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.132 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.132 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63e19746-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:56 compute-0 NetworkManager[54920]: <info>  [1769159996.1346] manager: (tap63e19746-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 23 09:19:56 compute-0 kernel: tap63e19746-30: entered promiscuous mode
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.134 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.139 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63e19746-30, col_values=(('external_ids', {'iface-id': 'b10bca13-6c88-49ff-b7f6-9cb6db2a6a71'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:19:56 compute-0 ovn_controller[94697]: 2026-01-23T09:19:56Z|00151|binding|INFO|Releasing lport b10bca13-6c88-49ff-b7f6-9cb6db2a6a71 from this chassis (sb_readonly=0)
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.141 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.158 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.157 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.159 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8f20bfe0-f1d0-450b-97c5-8fbdef345612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.159 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/63e19746-3286-4b7a-ad2d-556b0af85c5a.pid.haproxy
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 63e19746-3286-4b7a-ad2d-556b0af85c5a
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:19:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:19:56.160 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'env', 'PROCESS_TAG=haproxy-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63e19746-3286-4b7a-ad2d-556b0af85c5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.159 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.161 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.253 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.259 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.329 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.330 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.385 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.386 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159996.3843105, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.386 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Resumed (Lifecycle Event)
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.389 182096 DEBUG nova.compute.manager [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.396 182096 INFO nova.virt.libvirt.driver [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance rebooted successfully.
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.396 182096 DEBUG nova.compute.manager [None req-4886effe-bde1-4580-8840-e8b2d3fdb511 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.406 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.422 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.427 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.449 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.449 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769159996.3862193, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.449 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Started (Lifecycle Event)
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.464 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.474 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:19:56 compute-0 podman[215572]: 2026-01-23 09:19:56.665203027 +0000 UTC m=+0.066599061 container create c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:19:56 compute-0 systemd[1]: Started libpod-conmon-c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932.scope.
Jan 23 09:19:56 compute-0 podman[215572]: 2026-01-23 09:19:56.638767538 +0000 UTC m=+0.040163582 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:19:56 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:19:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc7903353a5e73396120a4b4f1588df7b97349a4b414beeceb6dbffe26c9bca1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:19:56 compute-0 podman[215572]: 2026-01-23 09:19:56.751054536 +0000 UTC m=+0.152450570 container init c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:19:56 compute-0 podman[215572]: 2026-01-23 09:19:56.757738126 +0000 UTC m=+0.159134160 container start c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.759 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.760 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5409MB free_disk=73.34296417236328GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.761 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.761 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:56 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [NOTICE]   (215587) : New worker (215589) forked
Jan 23 09:19:56 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [NOTICE]   (215587) : Loading success.
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.827 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.827 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance aaa5d40f-5771-495a-9e1a-76dab011324d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.827 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.828 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.828 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=4 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.890 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.901 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.915 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.916 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:56 compute-0 nova_compute[182092]: 2026-01-23 09:19:56.943 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:57 compute-0 nova_compute[182092]: 2026-01-23 09:19:57.449 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:19:57 compute-0 nova_compute[182092]: 2026-01-23 09:19:57.912 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:57 compute-0 nova_compute[182092]: 2026-01-23 09:19:57.913 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.072 182096 DEBUG nova.compute.manager [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.073 182096 DEBUG oslo_concurrency.lockutils [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.073 182096 DEBUG oslo_concurrency.lockutils [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.073 182096 DEBUG oslo_concurrency.lockutils [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.073 182096 DEBUG nova.compute.manager [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.074 182096 WARNING nova.compute.manager [req-92903296-066c-40fa-880a-47176fcb49c8 req-b75f7771-0b3b-430c-bc88-36d7dabdaeb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state active and task_state None.
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.787 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.787 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.787 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:19:58 compute-0 nova_compute[182092]: 2026-01-23 09:19:58.787 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.259 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updating instance_info_cache with network_info: [{"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.275 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.275 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.276 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.276 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.331 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:00 compute-0 nova_compute[182092]: 2026-01-23 09:20:00.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:01 compute-0 nova_compute[182092]: 2026-01-23 09:20:01.944 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:02 compute-0 nova_compute[182092]: 2026-01-23 09:20:02.070 182096 DEBUG nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:20:02 compute-0 ovn_controller[94697]: 2026-01-23T09:20:02Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:78:6d 10.100.0.7
Jan 23 09:20:02 compute-0 ovn_controller[94697]: 2026-01-23T09:20:02Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:78:6d 10.100.0.7
Jan 23 09:20:02 compute-0 podman[215620]: 2026-01-23 09:20:02.218210515 +0000 UTC m=+0.053278047 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 09:20:02 compute-0 podman[215621]: 2026-01-23 09:20:02.221264247 +0000 UTC m=+0.056045679 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:20:04 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 23 09:20:04 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000039.scope: Consumed 12.645s CPU time.
Jan 23 09:20:04 compute-0 systemd-machined[153562]: Machine qemu-23-instance-00000039 terminated.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.081 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance shutdown successfully after 13 seconds.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.086 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.087 182096 DEBUG nova.objects.instance [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'numa_topology' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:05 compute-0 podman[215665]: 2026-01-23 09:20:05.208522976 +0000 UTC m=+0.046590088 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.533 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Beginning cold snapshot process
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.623 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.623 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.624 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.624 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.624 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.632 182096 INFO nova.compute.manager [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Terminating instance
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.638 182096 DEBUG nova.compute.manager [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:20:05 compute-0 kernel: tap242626ef-65 (unregistering): left promiscuous mode
Jan 23 09:20:05 compute-0 NetworkManager[54920]: <info>  [1769160005.6573] device (tap242626ef-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.663 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 ovn_controller[94697]: 2026-01-23T09:20:05Z|00152|binding|INFO|Releasing lport 242626ef-6515-431f-9f4d-fc4aabe2c080 from this chassis (sb_readonly=0)
Jan 23 09:20:05 compute-0 ovn_controller[94697]: 2026-01-23T09:20:05Z|00153|binding|INFO|Setting lport 242626ef-6515-431f-9f4d-fc4aabe2c080 down in Southbound
Jan 23 09:20:05 compute-0 ovn_controller[94697]: 2026-01-23T09:20:05Z|00154|binding|INFO|Removing iface tap242626ef-65 ovn-installed in OVS
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.665 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.667 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.678 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:2b:87 10.100.0.10'], port_security=['fa:16:3e:2f:2b:87 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4dd3c99c-cee8-4856-b2a2-2bd42f7ac038', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '743f68086d8f4ae38a7b2fb3e91ce01c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '980000d9-53ce-405d-b2a7-06d2e52da4ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef5e5d22-91ef-425b-9a12-147397ffa1e3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=242626ef-6515-431f-9f4d-fc4aabe2c080) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.679 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 242626ef-6515-431f-9f4d-fc4aabe2c080 in datapath 63e19746-3286-4b7a-ad2d-556b0af85c5a unbound from our chassis
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.679 182096 DEBUG nova.privsep.utils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.679 182096 DEBUG oslo_concurrency.processutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk /var/lib/nova/instances/snapshots/tmpvr917469/6cacfcfdbcc74144a0ce0ed70ca944a3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.680 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63e19746-3286-4b7a-ad2d-556b0af85c5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.681 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d24a888-c44d-4340-990c-e665442cd627]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.682 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a namespace which is not needed anymore
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 23 09:20:05 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Consumed 9.450s CPU time.
Jan 23 09:20:05 compute-0 systemd-machined[153562]: Machine qemu-25-instance-00000034 terminated.
Jan 23 09:20:05 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [NOTICE]   (215587) : haproxy version is 2.8.14-c23fe91
Jan 23 09:20:05 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [NOTICE]   (215587) : path to executable is /usr/sbin/haproxy
Jan 23 09:20:05 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [WARNING]  (215587) : Exiting Master process...
Jan 23 09:20:05 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [ALERT]    (215587) : Current worker (215589) exited with code 143 (Terminated)
Jan 23 09:20:05 compute-0 neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a[215583]: [WARNING]  (215587) : All workers exited. Exiting... (0)
Jan 23 09:20:05 compute-0 systemd[1]: libpod-c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932.scope: Deactivated successfully.
Jan 23 09:20:05 compute-0 conmon[215583]: conmon c1be121cf7c27fd699a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932.scope/container/memory.events
Jan 23 09:20:05 compute-0 podman[215709]: 2026-01-23 09:20:05.790888865 +0000 UTC m=+0.045913513 container stop c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:20:05 compute-0 podman[215709]: 2026-01-23 09:20:05.819760803 +0000 UTC m=+0.074785449 container died c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.831 182096 DEBUG oslo_concurrency.processutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk /var/lib/nova/instances/snapshots/tmpvr917469/6cacfcfdbcc74144a0ce0ed70ca944a3" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.832 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Snapshot extracted, beginning image upload
Jan 23 09:20:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932-userdata-shm.mount: Deactivated successfully.
Jan 23 09:20:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-bc7903353a5e73396120a4b4f1588df7b97349a4b414beeceb6dbffe26c9bca1-merged.mount: Deactivated successfully.
Jan 23 09:20:05 compute-0 podman[215709]: 2026-01-23 09:20:05.844123451 +0000 UTC m=+0.099148097 container cleanup c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:20:05 compute-0 systemd[1]: libpod-conmon-c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932.scope: Deactivated successfully.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.880 182096 INFO nova.virt.libvirt.driver [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Instance destroyed successfully.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.881 182096 DEBUG nova.objects.instance [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lazy-loading 'resources' on Instance uuid 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.896 182096 DEBUG nova.virt.libvirt.vif [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-939361418',display_name='tempest-ListServerFiltersTestJSON-instance-939361418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-939361418',id=52,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='743f68086d8f4ae38a7b2fb3e91ce01c',ramdisk_id='',reservation_id='r-3gw46b98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-721478701',owner_user_name='tempest-ListServerFiltersTestJSON-721478701-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:56Z,user_data=None,user_id='66f8a16479154d23a187c0922062f421',uuid=4dd3c99c-cee8-4856-b2a2-2bd42f7ac038,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.896 182096 DEBUG nova.network.os_vif_util [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converting VIF {"id": "242626ef-6515-431f-9f4d-fc4aabe2c080", "address": "fa:16:3e:2f:2b:87", "network": {"id": "63e19746-3286-4b7a-ad2d-556b0af85c5a", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-204177319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "743f68086d8f4ae38a7b2fb3e91ce01c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242626ef-65", "ovs_interfaceid": "242626ef-6515-431f-9f4d-fc4aabe2c080", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.897 182096 DEBUG nova.network.os_vif_util [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.897 182096 DEBUG os_vif [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:05 compute-0 podman[215737]: 2026-01-23 09:20:05.8990618 +0000 UTC m=+0.034810362 container remove c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.899 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.899 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242626ef-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.903 182096 INFO os_vif [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:2b:87,bridge_name='br-int',has_traffic_filtering=True,id=242626ef-6515-431f-9f4d-fc4aabe2c080,network=Network(63e19746-3286-4b7a-ad2d-556b0af85c5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242626ef-65')
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.903 182096 INFO nova.virt.libvirt.driver [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Deleting instance files /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038_del
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.903 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[66d54743-e311-4455-b590-f801b9449ddf]: (4, ('Fri Jan 23 09:20:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a (c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932)\nc1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932\nFri Jan 23 09:20:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a (c1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932)\nc1be121cf7c27fd699a84a6540834a3f8f2edbe5c276d2c967070d4627fa4932\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.904 182096 INFO nova.virt.libvirt.driver [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Deletion of /var/lib/nova/instances/4dd3c99c-cee8-4856-b2a2-2bd42f7ac038_del complete
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.905 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0efd20-8a65-4135-955d-c5e4e0384524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.906 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63e19746-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.909 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 kernel: tap63e19746-30: left promiscuous mode
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.911 182096 DEBUG nova.compute.manager [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.912 182096 DEBUG oslo_concurrency.lockutils [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.912 182096 DEBUG oslo_concurrency.lockutils [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.912 182096 DEBUG oslo_concurrency.lockutils [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.913 182096 DEBUG nova.compute.manager [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.913 182096 DEBUG nova.compute.manager [req-554f7bae-3716-4af5-aaa7-9d2744f64454 req-a73304fc-511c-423c-aa3a-268f370ce9fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-unplugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.924 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a94e5ed1-7cdb-45e2-ae1d-135f8c65dd3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.932 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[60c96582-37dc-454b-86c7-731252e85f5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.933 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[80c4a64b-9d85-4d2e-b51b-2d5aba080264]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.945 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c04d7cee-2ba2-4399-afac-d6c1f378a872]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350130, 'reachable_time': 32592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215760, 'error': None, 'target': 'ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d63e19746\x2d3286\x2d4b7a\x2dad2d\x2d556b0af85c5a.mount: Deactivated successfully.
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.946 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63e19746-3286-4b7a-ad2d-556b0af85c5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:20:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:05.946 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[767eeb57-381d-496c-8752-05efe7a382de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.952 182096 INFO nova.compute.manager [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.952 182096 DEBUG oslo.service.loopingcall [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.953 182096 DEBUG nova.compute.manager [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:20:05 compute-0 nova_compute[182092]: 2026-01-23 09:20:05.953 182096 DEBUG nova.network.neutron [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.445 182096 DEBUG nova.network.neutron [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.465 182096 INFO nova.compute.manager [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Took 0.51 seconds to deallocate network for instance.
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.516 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.516 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.526 182096 DEBUG nova.compute.manager [req-6a8e85ba-847c-4c23-b3db-c86a4318c910 req-25596d95-4e1c-4b12-830a-fe4261050757 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-deleted-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.612 182096 DEBUG nova.compute.provider_tree [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.649 182096 DEBUG nova.scheduler.client.report [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.663 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.684 182096 INFO nova.scheduler.client.report [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Deleted allocations for instance 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.738 182096 DEBUG oslo_concurrency.lockutils [None req-7cc1b44f-ca00-49ac-bb49-56db975a4fb4 66f8a16479154d23a187c0922062f421 743f68086d8f4ae38a7b2fb3e91ce01c - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:06 compute-0 nova_compute[182092]: 2026-01-23 09:20:06.945 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.952 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Snapshot image upload complete
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.953 182096 DEBUG nova.compute.manager [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.991 182096 DEBUG nova.compute.manager [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.992 182096 DEBUG oslo_concurrency.lockutils [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.992 182096 DEBUG oslo_concurrency.lockutils [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.992 182096 DEBUG oslo_concurrency.lockutils [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4dd3c99c-cee8-4856-b2a2-2bd42f7ac038-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.992 182096 DEBUG nova.compute.manager [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] No waiting events found dispatching network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:07 compute-0 nova_compute[182092]: 2026-01-23 09:20:07.992 182096 WARNING nova.compute.manager [req-4b919ba7-8076-42f3-8d9c-af9eb0d7bf9a req-b0370a3c-bb60-42c4-963d-288d3322a527 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Received unexpected event network-vif-plugged-242626ef-6515-431f-9f4d-fc4aabe2c080 for instance with vm_state deleted and task_state None.
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.045 182096 INFO nova.compute.manager [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shelve offloading
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.054 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.054 182096 DEBUG nova.compute.manager [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.056 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.056 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquired lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.056 182096 DEBUG nova.network.neutron [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.290 182096 DEBUG nova.network.neutron [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.593 182096 DEBUG nova.network.neutron [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.613 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Releasing lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.618 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.618 182096 DEBUG nova.objects.instance [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'resources' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.625 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Deleting instance files /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096_del
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.629 182096 INFO nova.virt.libvirt.driver [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Deletion of /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096_del complete
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.713 182096 INFO nova.scheduler.client.report [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Deleted allocations for instance c58940b5-b4b8-4e1b-a741-cce80dd02096
Jan 23 09:20:08 compute-0 ovn_controller[94697]: 2026-01-23T09:20:08Z|00155|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.762 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.772 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.773 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.817 182096 DEBUG nova.compute.provider_tree [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.829 182096 DEBUG nova.scheduler.client.report [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.849 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:08 compute-0 nova_compute[182092]: 2026-01-23 09:20:08.929 182096 DEBUG oslo_concurrency.lockutils [None req-3a355c77-acbc-40b6-98f7-2763f3377c75 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 16.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:10 compute-0 ovn_controller[94697]: 2026-01-23T09:20:10Z|00156|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:20:10 compute-0 nova_compute[182092]: 2026-01-23 09:20:10.891 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:10 compute-0 nova_compute[182092]: 2026-01-23 09:20:10.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.429 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.430 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.430 182096 DEBUG nova.objects.instance [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.454 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.455 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.455 182096 INFO nova.compute.manager [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Unshelving
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.456 182096 DEBUG nova.objects.instance [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_requests' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.468 182096 DEBUG nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.536 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.536 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.541 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'pci_requests' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.549 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'numa_topology' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.557 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.557 182096 INFO nova.compute.claims [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.670 182096 DEBUG nova.compute.provider_tree [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.681 182096 DEBUG nova.scheduler.client.report [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.705 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.842 182096 DEBUG nova.policy [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.849 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.849 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquired lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.850 182096 DEBUG nova.network.neutron [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:11 compute-0 nova_compute[182092]: 2026-01-23 09:20:11.946 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.051 182096 DEBUG nova.network.neutron [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.423 182096 DEBUG nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully created port: 234110f5-482d-4f74-8806-8e50183b4820 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.451 182096 DEBUG nova.network.neutron [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.477 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Releasing lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.478 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.478 182096 INFO nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating image(s)
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.479 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.479 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.479 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.480 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'trusted_certs' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.487 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "048a134f49b908e4f6e5b4543a869b574113fd89" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:12 compute-0 nova_compute[182092]: 2026-01-23 09:20:12.488 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "048a134f49b908e4f6e5b4543a869b574113fd89" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.133 182096 DEBUG nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully updated port: 234110f5-482d-4f74-8806-8e50183b4820 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.167 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.167 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.167 182096 DEBUG nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:13 compute-0 podman[215761]: 2026-01-23 09:20:13.217615675 +0000 UTC m=+0.054064571 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.431 182096 DEBUG nova.compute.manager [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-changed-234110f5-482d-4f74-8806-8e50183b4820 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.431 182096 DEBUG nova.compute.manager [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing instance network info cache due to event network-changed-234110f5-482d-4f74-8806-8e50183b4820. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.431 182096 DEBUG oslo_concurrency.lockutils [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:13 compute-0 nova_compute[182092]: 2026-01-23 09:20:13.501 182096 WARNING nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.387 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.435 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.part --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.436 182096 DEBUG nova.virt.images [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] dc92976e-0762-4ffd-84a7-9ee18aeafca6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.437 182096 DEBUG nova.privsep.utils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.437 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.part /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.564 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.part /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.converted" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.571 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.619 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89.converted --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.620 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "048a134f49b908e4f6e5b4543a869b574113fd89" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.631 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.677 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.678 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "048a134f49b908e4f6e5b4543a869b574113fd89" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.679 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "048a134f49b908e4f6e5b4543a869b574113fd89" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.688 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.734 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.735 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89,backing_fmt=raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.759 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89,backing_fmt=raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.760 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "048a134f49b908e4f6e5b4543a869b574113fd89" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.761 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.809 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.810 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'migration_context' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.822 182096 INFO nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Rebasing disk image.
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.823 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.870 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:14 compute-0 nova_compute[182092]: 2026-01-23 09:20:14.871 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 -F raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.752 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 -F raw /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk" returned: 0 in 0.881s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.753 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.753 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Ensure instance console log exists: /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.754 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.754 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.754 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.755 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='6ff85660e5755d6ed1135c1d38cf3e3b',container_format='bare',created_at=2026-01-23T09:19:51Z,direct_url=<?>,disk_format='qcow2',id=dc92976e-0762-4ffd-84a7-9ee18aeafca6,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1311020306-shelved',owner='ce124b37bc3b4586ad3473321562d9fb',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-23T09:20:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.758 182096 WARNING nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.763 182096 DEBUG nova.virt.libvirt.host [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.763 182096 DEBUG nova.virt.libvirt.host [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.766 182096 DEBUG nova.virt.libvirt.host [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.766 182096 DEBUG nova.virt.libvirt.host [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.767 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.767 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='6ff85660e5755d6ed1135c1d38cf3e3b',container_format='bare',created_at=2026-01-23T09:19:51Z,direct_url=<?>,disk_format='qcow2',id=dc92976e-0762-4ffd-84a7-9ee18aeafca6,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1311020306-shelved',owner='ce124b37bc3b4586ad3473321562d9fb',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2026-01-23T09:20:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.768 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.768 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.768 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.768 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.769 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.769 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.769 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.769 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.769 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.770 182096 DEBUG nova.virt.hardware [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.770 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'vcpu_model' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.788 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'pci_devices' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.819 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <uuid>c58940b5-b4b8-4e1b-a741-cce80dd02096</uuid>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <name>instance-00000039</name>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1311020306</nova:name>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:20:15</nova:creationTime>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:user uuid="476826df6a2744be869eff79367a0516">tempest-UnshelveToHostMultiNodesTest-1442713466-project-member</nova:user>
Jan 23 09:20:15 compute-0 nova_compute[182092]:         <nova:project uuid="ce124b37bc3b4586ad3473321562d9fb">tempest-UnshelveToHostMultiNodesTest-1442713466</nova:project>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="dc92976e-0762-4ffd-84a7-9ee18aeafca6"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="serial">c58940b5-b4b8-4e1b-a741-cce80dd02096</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="uuid">c58940b5-b4b8-4e1b-a741-cce80dd02096</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/console.log" append="off"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:20:15 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:20:15 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:15 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:15 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:15 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.876 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.876 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.877 182096 INFO nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Using config drive
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.887 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'ec2_ids' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.902 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:15 compute-0 nova_compute[182092]: 2026-01-23 09:20:15.929 182096 DEBUG nova.objects.instance [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lazy-loading 'keypairs' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.144 182096 INFO nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Creating config drive at /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.149 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8wg5z7m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.269 182096 DEBUG oslo_concurrency.processutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8wg5z7m" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:16 compute-0 systemd-machined[153562]: New machine qemu-26-instance-00000039.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.310 182096 DEBUG nova.network.neutron [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:16 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000039.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.329 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.330 182096 DEBUG oslo_concurrency.lockutils [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.330 182096 DEBUG nova.network.neutron [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing network info cache for port 234110f5-482d-4f74-8806-8e50183b4820 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.333 182096 DEBUG nova.virt.libvirt.vif [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.333 182096 DEBUG nova.network.os_vif_util [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.334 182096 DEBUG nova.network.os_vif_util [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.334 182096 DEBUG os_vif [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.334 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.335 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.335 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.337 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.338 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap234110f5-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.338 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap234110f5-48, col_values=(('external_ids', {'iface-id': '234110f5-482d-4f74-8806-8e50183b4820', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:4a:5f', 'vm-uuid': 'aaa5d40f-5771-495a-9e1a-76dab011324d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 NetworkManager[54920]: <info>  [1769160016.3404] manager: (tap234110f5-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.340 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.345 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.345 182096 INFO os_vif [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48')
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.346 182096 DEBUG nova.virt.libvirt.vif [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.346 182096 DEBUG nova.network.os_vif_util [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.347 182096 DEBUG nova.network.os_vif_util [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.349 182096 DEBUG nova.virt.libvirt.guest [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] attach device xml: <interface type="ethernet">
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:f4:4a:5f"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <target dev="tap234110f5-48"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]: </interface>
Jan 23 09:20:16 compute-0 nova_compute[182092]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 09:20:16 compute-0 NetworkManager[54920]: <info>  [1769160016.3582] manager: (tap234110f5-48): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 23 09:20:16 compute-0 kernel: tap234110f5-48: entered promiscuous mode
Jan 23 09:20:16 compute-0 ovn_controller[94697]: 2026-01-23T09:20:16Z|00157|binding|INFO|Claiming lport 234110f5-482d-4f74-8806-8e50183b4820 for this chassis.
Jan 23 09:20:16 compute-0 ovn_controller[94697]: 2026-01-23T09:20:16Z|00158|binding|INFO|234110f5-482d-4f74-8806-8e50183b4820: Claiming fa:16:3e:f4:4a:5f 10.100.0.3
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.363 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.373 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:4a:5f 10.100.0.3'], port_security=['fa:16:3e:f4:4a:5f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=234110f5-482d-4f74-8806-8e50183b4820) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.374 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 234110f5-482d-4f74-8806-8e50183b4820 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.376 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:16 compute-0 ovn_controller[94697]: 2026-01-23T09:20:16Z|00159|binding|INFO|Setting lport 234110f5-482d-4f74-8806-8e50183b4820 ovn-installed in OVS
Jan 23 09:20:16 compute-0 ovn_controller[94697]: 2026-01-23T09:20:16Z|00160|binding|INFO|Setting lport 234110f5-482d-4f74-8806-8e50183b4820 up in Southbound
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.379 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.381 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 systemd-udevd[215844]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.403 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[305f0a92-b65b-416f-b9b4-0e181dc9a72a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 NetworkManager[54920]: <info>  [1769160016.4104] device (tap234110f5-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:20:16 compute-0 NetworkManager[54920]: <info>  [1769160016.4111] device (tap234110f5-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.434 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[fd69178a-1ff5-4a9b-98f7-5235437ce3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.436 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae3d5f0-67aa-4c9e-ba40-2eee2e239ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.458 182096 DEBUG nova.virt.libvirt.driver [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.458 182096 DEBUG nova.virt.libvirt.driver [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.458 182096 DEBUG nova.virt.libvirt.driver [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:38:78:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.459 182096 DEBUG nova.virt.libvirt.driver [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:f4:4a:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.461 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b66bdf74-dd8a-4a4d-b1e8-04b554c827fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.474 182096 DEBUG nova.virt.libvirt.guest [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:16</nova:creationTime>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:16 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     <nova:port uuid="234110f5-482d-4f74-8806-8e50183b4820">
Jan 23 09:20:16 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:20:16 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:16 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:16 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:16 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.476 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[478e95b2-3435-4937-b518-10a71a288b0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215851, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.487 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8df23b8a-cae5-40bf-8428-e4fcce403a39]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215852, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215852, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.488 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.490 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.490 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.491 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:16.491 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.495 182096 DEBUG oslo_concurrency.lockutils [None req-c16fd846-47a0-4cc7-a792-5e697ccf1a55 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.572 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for c58940b5-b4b8-4e1b-a741-cce80dd02096 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.573 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160016.5707393, c58940b5-b4b8-4e1b-a741-cce80dd02096 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.573 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] VM Resumed (Lifecycle Event)
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.574 182096 DEBUG nova.compute.manager [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.575 182096 DEBUG nova.virt.libvirt.driver [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.577 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance spawned successfully.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.609 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.611 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.627 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.628 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160016.5708463, c58940b5-b4b8-4e1b-a741-cce80dd02096 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.628 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] VM Started (Lifecycle Event)
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.639 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.642 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.655 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.742 182096 DEBUG nova.compute.manager [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.742 182096 DEBUG oslo_concurrency.lockutils [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.743 182096 DEBUG oslo_concurrency.lockutils [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.743 182096 DEBUG oslo_concurrency.lockutils [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.743 182096 DEBUG nova.compute.manager [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.743 182096 WARNING nova.compute.manager [req-f4df42b0-6787-403b-86d8-82ecf8a7ea2e req-c08c712e-ee08-461e-b4be-0785e320fe7a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 for instance with vm_state active and task_state None.
Jan 23 09:20:16 compute-0 nova_compute[182092]: 2026-01-23 09:20:16.949 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.326 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.326 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.327 182096 DEBUG nova.objects.instance [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.476 182096 DEBUG nova.compute.manager [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.557 182096 DEBUG oslo_concurrency.lockutils [None req-0f8e6346-acec-4f08-b6bd-723e8c2bbb43 4a2c7cc0a9e14ea3ad50b5b88ccd9f94 cf99859233184823b57a841a72c84bfa - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 6.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.876 182096 DEBUG nova.network.neutron [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updated VIF entry in instance network info cache for port 234110f5-482d-4f74-8806-8e50183b4820. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.876 182096 DEBUG nova.network.neutron [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.880 182096 DEBUG nova.objects.instance [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_requests' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.935 182096 DEBUG nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:20:17 compute-0 nova_compute[182092]: 2026-01-23 09:20:17.937 182096 DEBUG oslo_concurrency.lockutils [req-cc248603-6ac6-4711-b1a2-c832fe78e6d1 req-9c541268-f56c-43e7-9168-985a30bd670d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:18 compute-0 ovn_controller[94697]: 2026-01-23T09:20:18Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:4a:5f 10.100.0.3
Jan 23 09:20:18 compute-0 ovn_controller[94697]: 2026-01-23T09:20:18Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:4a:5f 10.100.0.3
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.343 182096 DEBUG nova.policy [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.847 182096 DEBUG nova.compute.manager [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.848 182096 DEBUG oslo_concurrency.lockutils [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.848 182096 DEBUG oslo_concurrency.lockutils [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.849 182096 DEBUG oslo_concurrency.lockutils [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.849 182096 DEBUG nova.compute.manager [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.849 182096 WARNING nova.compute.manager [req-fa8274e5-329f-4176-b7e9-cfdac51d43b1 req-4106cf5c-f12d-4c49-adf7-14540e064283 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 for instance with vm_state active and task_state None.
Jan 23 09:20:18 compute-0 nova_compute[182092]: 2026-01-23 09:20:18.980 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:19 compute-0 nova_compute[182092]: 2026-01-23 09:20:19.542 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:19 compute-0 nova_compute[182092]: 2026-01-23 09:20:19.543 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:19 compute-0 nova_compute[182092]: 2026-01-23 09:20:19.543 182096 INFO nova.compute.manager [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shelving
Jan 23 09:20:19 compute-0 nova_compute[182092]: 2026-01-23 09:20:19.568 182096 DEBUG nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:20:19 compute-0 nova_compute[182092]: 2026-01-23 09:20:19.611 182096 DEBUG nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully created port: 9f6084b0-c202-400c-b135-db64f24411c2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.182 182096 DEBUG nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully updated port: 9f6084b0-c202-400c-b135-db64f24411c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.195 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.195 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.195 182096 DEBUG nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.355 182096 WARNING nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.355 182096 WARNING nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.360 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.879 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160005.8782923, 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.880 182096 INFO nova.compute.manager [-] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] VM Stopped (Lifecycle Event)
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.896 182096 DEBUG nova.compute.manager [None req-02a960ac-513c-4ab7-8fb2-330081681ac7 - - - - - -] [instance: 4dd3c99c-cee8-4856-b2a2-2bd42f7ac038] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.944 182096 DEBUG nova.compute.manager [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-changed-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.944 182096 DEBUG nova.compute.manager [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing instance network info cache due to event network-changed-9f6084b0-c202-400c-b135-db64f24411c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:20:20 compute-0 nova_compute[182092]: 2026-01-23 09:20:20.944 182096 DEBUG oslo_concurrency.lockutils [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:21.060 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:21.060 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:20:21 compute-0 nova_compute[182092]: 2026-01-23 09:20:21.062 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:21 compute-0 nova_compute[182092]: 2026-01-23 09:20:21.200 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:21 compute-0 podman[215861]: 2026-01-23 09:20:21.233324158 +0000 UTC m=+0.064232117 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:20:21 compute-0 podman[215860]: 2026-01-23 09:20:21.267356411 +0000 UTC m=+0.099469043 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:20:21 compute-0 nova_compute[182092]: 2026-01-23 09:20:21.339 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:21 compute-0 nova_compute[182092]: 2026-01-23 09:20:21.950 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.105 182096 DEBUG nova.network.neutron [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.145 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.146 182096 DEBUG oslo_concurrency.lockutils [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.146 182096 DEBUG nova.network.neutron [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing network info cache for port 9f6084b0-c202-400c-b135-db64f24411c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.148 182096 DEBUG nova.virt.libvirt.vif [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.148 182096 DEBUG nova.network.os_vif_util [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.149 182096 DEBUG nova.network.os_vif_util [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.149 182096 DEBUG os_vif [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.150 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.150 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.151 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.153 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.153 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f6084b0-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.153 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f6084b0-c2, col_values=(('external_ids', {'iface-id': '9f6084b0-c202-400c-b135-db64f24411c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:1a:d5', 'vm-uuid': 'aaa5d40f-5771-495a-9e1a-76dab011324d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.154 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 NetworkManager[54920]: <info>  [1769160023.1555] manager: (tap9f6084b0-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.156 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.162 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.164 182096 INFO os_vif [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2')
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.164 182096 DEBUG nova.virt.libvirt.vif [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.165 182096 DEBUG nova.network.os_vif_util [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.165 182096 DEBUG nova.network.os_vif_util [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.167 182096 DEBUG nova.virt.libvirt.guest [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] attach device xml: <interface type="ethernet">
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:bf:1a:d5"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <target dev="tap9f6084b0-c2"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]: </interface>
Jan 23 09:20:23 compute-0 nova_compute[182092]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 09:20:23 compute-0 kernel: tap9f6084b0-c2: entered promiscuous mode
Jan 23 09:20:23 compute-0 NetworkManager[54920]: <info>  [1769160023.1768] manager: (tap9f6084b0-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.178 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 ovn_controller[94697]: 2026-01-23T09:20:23Z|00161|binding|INFO|Claiming lport 9f6084b0-c202-400c-b135-db64f24411c2 for this chassis.
Jan 23 09:20:23 compute-0 ovn_controller[94697]: 2026-01-23T09:20:23Z|00162|binding|INFO|9f6084b0-c202-400c-b135-db64f24411c2: Claiming fa:16:3e:bf:1a:d5 10.100.0.14
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.191 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:1a:d5 10.100.0.14'], port_security=['fa:16:3e:bf:1a:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=9f6084b0-c202-400c-b135-db64f24411c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.192 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 9f6084b0-c202-400c-b135-db64f24411c2 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.193 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 systemd-udevd[215906]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:20:23 compute-0 ovn_controller[94697]: 2026-01-23T09:20:23Z|00163|binding|INFO|Setting lport 9f6084b0-c202-400c-b135-db64f24411c2 ovn-installed in OVS
Jan 23 09:20:23 compute-0 ovn_controller[94697]: 2026-01-23T09:20:23Z|00164|binding|INFO|Setting lport 9f6084b0-c202-400c-b135-db64f24411c2 up in Southbound
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.225 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.227 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 NetworkManager[54920]: <info>  [1769160023.2345] device (tap9f6084b0-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.234 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[995d635e-f6cd-4b4d-bec3-e716ad2cd593]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 NetworkManager[54920]: <info>  [1769160023.2383] device (tap9f6084b0-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.260 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[229ba85b-cf80-4f75-9d48-be0adc1f747f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.265 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[cb54eea6-0efd-44ba-865b-e34cd1833563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.284 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5b69ecb1-017d-44b1-be9e-a2c8e06f37fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.298 182096 DEBUG nova.virt.libvirt.driver [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.298 182096 DEBUG nova.virt.libvirt.driver [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.298 182096 DEBUG nova.virt.libvirt.driver [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:38:78:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.299 182096 DEBUG nova.virt.libvirt.driver [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:f4:4a:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.299 182096 DEBUG nova.virt.libvirt.driver [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:bf:1a:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.305 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[771fd999-246f-4b40-953b-110c760b5fe4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 8, 'rx_bytes': 658, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 8, 'rx_bytes': 658, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215913, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.318 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ba19a6-928c-4804-9b15-d4bf5b6c4963]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215914, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215914, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.319 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.321 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.321 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.321 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:23.322 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.325 182096 DEBUG nova.virt.libvirt.guest [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:23</nova:creationTime>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:23 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:port uuid="234110f5-482d-4f74-8806-8e50183b4820">
Jan 23 09:20:23 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:23 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:23 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:23 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:23 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:23 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:20:23 compute-0 nova_compute[182092]: 2026-01-23 09:20:23.350 182096 DEBUG oslo_concurrency.lockutils [None req-649474a6-9fc1-4796-b485-189dd3ec2c00 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:24 compute-0 nova_compute[182092]: 2026-01-23 09:20:24.112 182096 DEBUG nova.network.neutron [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updated VIF entry in instance network info cache for port 9f6084b0-c202-400c-b135-db64f24411c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:20:24 compute-0 nova_compute[182092]: 2026-01-23 09:20:24.113 182096 DEBUG nova.network.neutron [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:24 compute-0 nova_compute[182092]: 2026-01-23 09:20:24.132 182096 DEBUG oslo_concurrency.lockutils [req-a7dec90b-4afc-4da9-a3ad-dbfb447f97f7 req-fd0b527a-d248-4006-84cd-1ba7cb3ea154 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:25 compute-0 ovn_controller[94697]: 2026-01-23T09:20:25Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:1a:d5 10.100.0.14
Jan 23 09:20:25 compute-0 ovn_controller[94697]: 2026-01-23T09:20:25Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:1a:d5 10.100.0.14
Jan 23 09:20:26 compute-0 nova_compute[182092]: 2026-01-23 09:20:26.952 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:27.062 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.155 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.328 182096 DEBUG nova.compute.manager [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.328 182096 DEBUG oslo_concurrency.lockutils [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.329 182096 DEBUG oslo_concurrency.lockutils [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.329 182096 DEBUG oslo_concurrency.lockutils [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.329 182096 DEBUG nova.compute.manager [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:28 compute-0 nova_compute[182092]: 2026-01-23 09:20:28.330 182096 WARNING nova.compute.manager [req-a8bbadd6-4e53-40d6-9b19-7970575a06f7 req-56d570ce-518b-461d-bb57-5f9063ab8dea 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 for instance with vm_state active and task_state None.
Jan 23 09:20:29 compute-0 nova_compute[182092]: 2026-01-23 09:20:29.597 182096 DEBUG nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:20:29 compute-0 nova_compute[182092]: 2026-01-23 09:20:29.741 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:29 compute-0 nova_compute[182092]: 2026-01-23 09:20:29.741 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:29 compute-0 nova_compute[182092]: 2026-01-23 09:20:29.742 182096 DEBUG nova.objects.instance [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.151 182096 DEBUG nova.objects.instance [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_requests' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.163 182096 DEBUG nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.514 182096 DEBUG nova.compute.manager [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.515 182096 DEBUG oslo_concurrency.lockutils [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.515 182096 DEBUG oslo_concurrency.lockutils [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.515 182096 DEBUG oslo_concurrency.lockutils [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.516 182096 DEBUG nova.compute.manager [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.516 182096 WARNING nova.compute.manager [req-a00f13af-cba1-457c-9fc6-0dec58b31ff0 req-3b2ccf2f-8e3a-4347-8bc0-0cb61a73f6e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 for instance with vm_state active and task_state None.
Jan 23 09:20:30 compute-0 nova_compute[182092]: 2026-01-23 09:20:30.606 182096 DEBUG nova.policy [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.288 182096 DEBUG nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Successfully updated port: 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.298 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.298 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.299 182096 DEBUG nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.375 182096 DEBUG nova.compute.manager [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-changed-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.375 182096 DEBUG nova.compute.manager [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing instance network info cache due to event network-changed-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.375 182096 DEBUG oslo_concurrency.lockutils [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.441 182096 WARNING nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.441 182096 WARNING nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.441 182096 WARNING nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:20:31 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 23 09:20:31 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000039.scope: Consumed 11.121s CPU time.
Jan 23 09:20:31 compute-0 systemd-machined[153562]: Machine qemu-26-instance-00000039 terminated.
Jan 23 09:20:31 compute-0 nova_compute[182092]: 2026-01-23 09:20:31.955 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:32 compute-0 nova_compute[182092]: 2026-01-23 09:20:32.606 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance shutdown successfully after 13 seconds.
Jan 23 09:20:32 compute-0 nova_compute[182092]: 2026-01-23 09:20:32.609 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:32 compute-0 nova_compute[182092]: 2026-01-23 09:20:32.609 182096 DEBUG nova.objects.instance [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'numa_topology' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:32 compute-0 nova_compute[182092]: 2026-01-23 09:20:32.836 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Beginning cold snapshot process
Jan 23 09:20:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:32.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c58940b5-b4b8-4e1b-a741-cce80dd02096', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-1311020306', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'dc92976e-0762-4ffd-84a7-9ee18aeafca6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000039', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'ce124b37bc3b4586ad3473321562d9fb', 'user_id': '476826df6a2744be869eff79367a0516', 'hostId': '0613fa55ee2724ef8813a2a9a2f3ab45f5da255a2e92893942da63d1', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.000 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c2a29923657746f0a0e0d489a2b1a730', 'user_id': '654d6343796442d7946c6adfe1179a1f', 'hostId': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.001 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 nova_compute[182092]: 2026-01-23 09:20:33.006 182096 DEBUG nova.privsep.utils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:20:33 compute-0 nova_compute[182092]: 2026-01-23 09:20:33.007 182096 DEBUG oslo_concurrency.processutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk /var/lib/nova/instances/snapshots/tmp2qk9vjxh/33a4a8e7dd254203992778d28759358e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.009 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.009 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c59ff3e9-9870-4b9d-b941-66f7efc3b205', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.001198', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4b02e80-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': '34dd08b1ca0640b704a49e3a287084fa097c8c0b8fa19a917166b8f2bf7a067a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.001198', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4b037e0-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': '943d95f843baf6c994eb9017c7ddfbdbea0d7edabce6b45abccc0936103886a1'}]}, 'timestamp': '2026-01-23 09:20:33.009560', '_unique_id': 'e3bba8af1a984c5da61b45a8c8eecb5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.010 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.013 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.031 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.bytes volume: 73003008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.031 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf2310f-d900-4ede-9450-0c1fb0f431b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73003008, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.011126', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4b399c6-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '6ee8bb935acbb3ed11d5d29020bb541ec5ca8d8acf00a7f75de97fd46365ee82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.011126', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4b3a6f0-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '1fb20efe1efb6ddd1737ea0dffa531ea4c2916fe9362a4a22ffdf6fcc65faa09'}]}, 'timestamp': '2026-01-23 09:20:33.032063', '_unique_id': '88690acf9f8749f6b8ad8e23a11a86ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.034 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.036 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aaa5d40f-5771-495a-9e1a-76dab011324d / tap592a06ee-26 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.036 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aaa5d40f-5771-495a-9e1a-76dab011324d / tap234110f5-48 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.037 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aaa5d40f-5771-495a-9e1a-76dab011324d / tap9f6084b0-c2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.037 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.037 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.037 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c357234f-08c4-4e03-9a6f-1c03c2522c44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.034238', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b479d6-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'bf56e36d4a589a6a311b8d47cdab7a313fe1ef369cfbbb48f8834c4764b656a9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.034238', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b48408-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '5f49028cf76fe081a5ccd36116f401fb5ea98a70e5bdcbf82fc93f0e280d3663'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.034238', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b48d5e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '4a40f820c36533ba68487a1a6b2e23c68014308f8375b73d7d346f611c94959e'}]}, 'timestamp': '2026-01-23 09:20:33.037965', '_unique_id': '417faa483c614f27b7f18ff22c60c585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.038 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.039 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.039 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>]
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.039 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.040 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.051 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/cpu volume: 9980000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c04933b4-148d-4613-9fd6-88796952ffc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9980000000, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'timestamp': '2026-01-23T09:20:33.039975', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c4b6b598-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.581186018, 'message_signature': 'd96aa5722d85716284185465e2b137ba95306c52d09bef54aef052ed56d2d5c9'}]}, 'timestamp': '2026-01-23 09:20:33.052117', '_unique_id': '13ce11a3fdc0430cb7fe5658c420b3e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.053 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.053 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.054 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.054 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4ae318a-0b87-4dd1-8e48-d32e3b12a973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 34, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.053441', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b7066a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '5b3c20d56bd382e621423ccb6354a7618724053c1904cd2cc631ab1f9b61ed7a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 
'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.053441', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b70f5c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '99d504f2f894852166bcb95f16922a93585d12a481fce3e2bd2296e81e68038d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.053441', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b71736-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '39332361da0e389f4351c2fbf5eeda6d1b48b708b0d1cafe2df37baf638effde'}]}, 'timestamp': '2026-01-23 09:20:33.054591', '_unique_id': '29d2bac6e88a49b1ba0c43f8efb1d115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.056 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.056 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.requests volume: 1097 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.056 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '009f093a-c414-4ef8-a1c8-8e7071597408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1097, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.055920', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4b765e2-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '71537c68ef2963cb3506004c4e575e470f47ab935e0d7feff5d59a6b17c4149e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': 
None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.055920', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4b76e8e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '96226fdc547eabf38bbb09eed068bac0185f88e738de981ea207095fdc9662a0'}]}, 'timestamp': '2026-01-23 09:20:33.056825', '_unique_id': '32edfcb7f0514819965e3d132995423d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.058 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.058 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.058 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.058 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07bed9ca-9901-4945-9ace-31ea3e321e16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.057930', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b7b3a8-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'a3a0cb60e3058aaa584309cd2927fd6788109b60ea0684f32c11fb3ab641960e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.057930', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b7bd4e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'f11b979f88a9444bf723bf380c86056ed6db3bd356b10fe7af9c6464d8081d40'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.057930', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b7c528-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '87279f641364ae7279fe05dc8dac1b10c6a7155553e5b41e3bd44be4d788cee5'}]}, 'timestamp': '2026-01-23 09:20:33.059045', '_unique_id': '338a3be210bb44b886b92cdc277cad52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.060 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.060 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>]
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.060 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.063 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/memory.usage volume: 44.9296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ce60eda-d367-48bc-a69c-5591c2045d1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.9296875, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'timestamp': '2026-01-23T09:20:33.060452', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c4b890d4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.581186018, 'message_signature': 'bb8015aad7b9f1055e39d9a6cf3a82d702e7077f02a129d35418b63f8d0f51e7'}]}, 'timestamp': '2026-01-23 09:20:33.064264', '_unique_id': 'e0a94627cd38426287e8614d33789b5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.065 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.065 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.065 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>]
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.066 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.066 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.066 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.066 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dffc8a7-99a4-4077-977e-0cbb98c7e3aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.065640', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b8e2fa-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'c831e8f069ca246c291ab157e786638e4884ac01b3433f9b9b505de42af03541'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.065640', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b8eb60-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '57b6df258e6c330178b27caebc3901f74b35d2e78c039b098567b7a362ac4233'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.065640', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b8f3bc-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '10319a29610b3905b99cc5ae4d4a8709b6d338caddeb3a134772964cf9b2dbb5'}]}, 'timestamp': '2026-01-23 09:20:33.066793', '_unique_id': 'f532ab8945844c4d98b5f27e84902fa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.067 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.068 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.068 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.068 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.068 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9d49f18-7689-4a3e-82a7-54eaf071c1ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.067905', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b9394e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'b5176b82949f3110c8d0842a6524f6754e276651e3f170af0e31bf38fb3e68b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.067905', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b94254-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'fa59f26363f12776ce3028c7ceb67dfa961cb9f2e9c2fe4d7d52bb0b4c080058'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.067905', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b94a42-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '5d10f6f0f5413a2c3c8c41cc76d3ce198e0ccb0ff60a47039bbe57c21e7993e7'}]}, 'timestamp': '2026-01-23 09:20:33.069005', '_unique_id': '90e1bfcdb4a1401f979d4c589611e5e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.071 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.071 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.071 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b05ff39a-997b-457b-a1ac-99eb88a867c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.071110', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4b9b8b0-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '6add346d69207071feca969d34715dc15860b0bd08f1d2ae1dbe5505a58ac5d5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.071110', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4b9c134-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '82132c382b7207e9eeadc8014d36f251f3a567afcfa571d532e8e687435f2aa9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.071110', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4b9c954-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '5058b1bb09d320923ec44453ad4817d3800dab0473aaad44a7a054ca2e74c93a'}]}, 'timestamp': '2026-01-23 09:20:33.072257', '_unique_id': 'cdee5be48cfd4bbeb37df9f66b3291f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.073 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.073 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '033320be-df1e-4789-a112-47fc6d3d38ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.073350', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4ba0e28-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'ea99fdf29aff4561fd8d749227816311078e6e1a99e899b9e26464d0e587c5a9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.073350', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4ba16d4-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '5e68ea67321465bdea302b1485ea52f5c9c518908a32433070528790ac0925fe'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.073350', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4ba1e9a-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '4a37cf506c462c6f3789c1f3d265995fdb8713342e3c1c7529b91b056adaf742'}]}, 'timestamp': '2026-01-23 09:20:33.074440', '_unique_id': 'abfbf08c978644a9baca433eddd14107'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.077 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.078 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.078 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.078 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c566a352-1a4c-4ae6-b062-007acf640240', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.075532', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4bab92c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'f46d8cdc726d12daf463048debf4383fa126a4b5e9fc0de18d301cc3a1f8b244'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.075532', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4bac2e6-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'c488f4b9f9cc90f50ad5bb7df1b9d9ad620c6cacb9424fb47f970e0b5fc70a85'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.075532', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4bacd5e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'e1eb27435ce80e76b6c0d2dda58547414ec1d5074ba2f80669dc972a32be858d'}]}, 'timestamp': '2026-01-23 09:20:33.078922', '_unique_id': '3eb3c4c91867455a8c78314ddfc08494'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.081 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.081 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.latency volume: 340740586 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.081 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33286c9e-37ca-4447-a526-6176bafe4e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 340740586, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.080360', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bb2d80-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '366656842fde3dceea238daa1b7fc7d03a2e3ae641e8fdf9660a63fe19e38c7c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 
'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.080360', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bb358c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '411dd5c0f2cda1ae5b092f5a2527a803dcc810b747d96a69c862fdfb3fcf6d1d'}]}, 'timestamp': '2026-01-23 09:20:33.081578', '_unique_id': '039cbe150d004f6f9cc60cdbdc59a8f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.084 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.084 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.latency volume: 225886170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.084 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.latency volume: 16492875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '344469bb-842d-4e08-bd51-3f491b05073d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 225886170, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.082753', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bbb55c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': 'ec27296151deb01efbc2da576f71f3c7f279449e060ee41559f45993b34118f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16492875, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': 
None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.082753', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bbbd7c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '4b9520d1d405b18a4e36626046f5927db80a49bc9ec49be1c0c0af7e26b40efd'}]}, 'timestamp': '2026-01-23 09:20:33.085059', '_unique_id': 'c106fb3a1bf24d0dae7b4770116fa9b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.086 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79c08e95-99fa-44da-bec9-4a504b00b170', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.086176', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bc143e-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '3af398b621034d191b0f433e054c1d0e221bd41f44f0953721e292b92c8f67bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 
'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.086176', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bc1c2c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': 'f9fef12e57efc06bcd097c6b0a51e517a43c55c13dd386ca7a3513c286c49488'}]}, 'timestamp': '2026-01-23 09:20:33.087482', '_unique_id': 'd7c035e763c64491b71bcc1be6ea5352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.089 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.089 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.089 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '621b888b-b9eb-4109-b804-d6acebc3f4e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.088581', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bc686c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '5d953a4b87bead72cb6e9594fbfefa0f7c898054e5eabdc0d9093f908c39ccc7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': 
None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.088581', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bc708c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.543081721, 'message_signature': '4bcf0b77a46c12457e80bb76958e3211bf9c7f9cccb765b51d8be8d4f0d3d576'}]}, 'timestamp': '2026-01-23 09:20:33.089642', '_unique_id': '201e483477e1497b9c8bf3c1a0bb5e8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.090 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1311020306>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-465181778>]
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.091 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c5cc8a5-371a-4440-a297-ff58004cfda9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.091074', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bcd7c0-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': 'f7c65fa12efae18354a67d63a67fe3b3e697a35bf7150f284364b39f57e76e77'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 
'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.091074', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bcdfe0-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': 'bcda846e784889aeb32c506398b24a48123fcc0cc1622a1c859293fbe2d7aa6d'}]}, 'timestamp': '2026-01-23 09:20:33.092491', '_unique_id': '946eec19d68740d6b6bab953c956c85a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.092 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.093 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.094 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.094 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.094 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68da6b7d-7f55-4ed9-84de-30be3dc50864', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.093564', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4bd252c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'faf05229b1445cef85d1cce74262e1ee2de0226fe40658bc71e1b8ea9af1ff12'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.093564', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4bd2d9c-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'ccf16c474faf6955f95e47e96db3f2ba707073f25168198f4937c884918af001'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.093564', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4bd3580-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': 'd806a18009dbeec41d746db302ad9ba88bc15a3b5e194d36f01517b2f0ef0cd0'}]}, 'timestamp': '2026-01-23 09:20:33.094705', '_unique_id': '1670ee9d2afc4242ab5d664c962e8e60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.096 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.096 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes volume: 4531 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.096 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.096 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/network.incoming.bytes volume: 1388 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5f11a40-7622-448f-8809-5c7890a870c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4531, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap592a06ee-26', 'timestamp': '2026-01-23T09:20:33.095815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap592a06ee-26', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap592a06ee-26'}, 'message_id': 'c4bd7bf8-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '8d6d610e4cf5c3fc46c109d59195f9ee96465a98f9373db0d27f6c7022ae6600'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': 
'654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap234110f5-48', 'timestamp': '2026-01-23T09:20:33.095815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap234110f5-48', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:4a:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap234110f5-48'}, 'message_id': 'c4bd8440-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '745befd8bcf301fc995606908bec1269372b02edd45c6a57c1cf0b6ca4891f97'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1388, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'instance-00000038-aaa5d40f-5771-495a-9e1a-76dab011324d-tap9f6084b0-c2', 'timestamp': '2026-01-23T09:20:33.095815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'tap9f6084b0-c2', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': 
'9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bf:1a:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f6084b0-c2'}, 'message_id': 'c4bd8d14-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.564290619, 'message_signature': '2a9c3dcfa09d0d980ea798331617fcf38be0ab79695b7f5c823775ce27c8579c'}]}, 'timestamp': '2026-01-23 09:20:33.096929', '_unique_id': 'ac9015f9b1e74ea8889eba3ff56b1930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.098 12 DEBUG ceilometer.compute.pollsters [-] Instance c58940b5-b4b8-4e1b-a741-cce80dd02096 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000039, id=c58940b5-b4b8-4e1b-a741-cce80dd02096>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.098 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.098 12 DEBUG ceilometer.compute.pollsters [-] aaa5d40f-5771-495a-9e1a-76dab011324d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '829df35b-449d-4b3e-a6e5-ff7fc08e0df3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d-vda', 'timestamp': '2026-01-23T09:20:33.098026', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4bdd260-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': '3e92a4fbf2dc8437f0df493392ef9bf085f280dc63db17496973e7ad7f4e1199'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_name': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_name': None, 'resource_id': 
'aaa5d40f-5771-495a-9e1a-76dab011324d-sda', 'timestamp': '2026-01-23T09:20:33.098026', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-465181778', 'name': 'instance-00000038', 'instance_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'instance_type': 'm1.nano', 'host': '9e07d9e0034d973c3752d114efb8de120c257ba6fc27ac5c363dc006', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4bddba2-f83c-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3538.53130453, 'message_signature': '2e022e37254b55874b8c879bc7a63fcdab8c72d22d30262483e21d0a03f20f01'}]}, 'timestamp': '2026-01-23 09:20:33.098936', '_unique_id': '5e8e8f9d537d4c81a38723fb0dd6955b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:20:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:20:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:20:33 compute-0 nova_compute[182092]: 2026-01-23 09:20:33.146 182096 DEBUG oslo_concurrency.processutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096/disk /var/lib/nova/instances/snapshots/tmp2qk9vjxh/33a4a8e7dd254203992778d28759358e" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:33 compute-0 nova_compute[182092]: 2026-01-23 09:20:33.147 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Snapshot extracted, beginning image upload
Jan 23 09:20:33 compute-0 nova_compute[182092]: 2026-01-23 09:20:33.158 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:33 compute-0 podman[215938]: 2026-01-23 09:20:33.204281719 +0000 UTC m=+0.038943171 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:20:33 compute-0 podman[215939]: 2026-01-23 09:20:33.209717594 +0000 UTC m=+0.042335641 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.208 182096 DEBUG nova.network.neutron [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.230 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.232 182096 DEBUG oslo_concurrency.lockutils [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.232 182096 DEBUG nova.network.neutron [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Refreshing network info cache for port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.236 182096 DEBUG nova.virt.libvirt.vif [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.236 182096 DEBUG nova.network.os_vif_util [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.237 182096 DEBUG nova.network.os_vif_util [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.237 182096 DEBUG os_vif [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.238 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.238 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.241 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.242 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7712d7f0-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.243 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7712d7f0-d6, col_values=(('external_ids', {'iface-id': '7712d7f0-d6ca-4d95-af9d-6e66a1371f4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:b7:b3', 'vm-uuid': 'aaa5d40f-5771-495a-9e1a-76dab011324d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 NetworkManager[54920]: <info>  [1769160034.2456] manager: (tap7712d7f0-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.249 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.252 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.253 182096 INFO os_vif [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6')
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.254 182096 DEBUG nova.virt.libvirt.vif [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.254 182096 DEBUG nova.network.os_vif_util [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.255 182096 DEBUG nova.network.os_vif_util [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.257 182096 DEBUG nova.virt.libvirt.guest [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] attach device xml: <interface type="ethernet">
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:bd:b7:b3"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <target dev="tap7712d7f0-d6"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]: </interface>
Jan 23 09:20:34 compute-0 nova_compute[182092]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 09:20:34 compute-0 kernel: tap7712d7f0-d6: entered promiscuous mode
Jan 23 09:20:34 compute-0 NetworkManager[54920]: <info>  [1769160034.2679] manager: (tap7712d7f0-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Jan 23 09:20:34 compute-0 systemd-udevd[215923]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.270 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 ovn_controller[94697]: 2026-01-23T09:20:34Z|00165|binding|INFO|Claiming lport 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e for this chassis.
Jan 23 09:20:34 compute-0 ovn_controller[94697]: 2026-01-23T09:20:34Z|00166|binding|INFO|7712d7f0-d6ca-4d95-af9d-6e66a1371f4e: Claiming fa:16:3e:bd:b7:b3 10.100.0.13
Jan 23 09:20:34 compute-0 NetworkManager[54920]: <info>  [1769160034.2798] device (tap7712d7f0-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:20:34 compute-0 NetworkManager[54920]: <info>  [1769160034.2809] device (tap7712d7f0-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.283 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:b7:b3 10.100.0.13'], port_security=['fa:16:3e:bd:b7:b3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1561624706', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1561624706', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.284 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.285 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:34 compute-0 ovn_controller[94697]: 2026-01-23T09:20:34Z|00167|binding|INFO|Setting lport 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e ovn-installed in OVS
Jan 23 09:20:34 compute-0 ovn_controller[94697]: 2026-01-23T09:20:34Z|00168|binding|INFO|Setting lport 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e up in Southbound
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.292 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.297 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[79299cd2-2154-4c21-93e2-73dbda734590]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.318 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c1528481-e24d-47d7-80f6-7db4b8f902ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.322 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1566f659-adcd-4041-a6e9-f4d15ba95333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.344 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5d207f34-f37b-4eae-ac5f-bc998d2e0513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.348 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.349 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.349 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:38:78:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.349 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:f4:4a:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.350 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:bf:1a:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.350 182096 DEBUG nova.virt.libvirt.driver [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:bd:b7:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.360 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9ed2c6-a5a6-423f-a1e3-4a60554d70a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215988, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.366 182096 DEBUG nova.virt.libvirt.guest [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:34</nova:creationTime>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:34 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:port uuid="234110f5-482d-4f74-8806-8e50183b4820">
Jan 23 09:20:34 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:34 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:34 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:34 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:34 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:34 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:34 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.372 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[377dc3c7-306f-4e39-9634-e832b0c04a65]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215989, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215989, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.373 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.374 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.376 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.377 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.377 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:34.377 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.390 182096 DEBUG oslo_concurrency.lockutils [None req-dff5419d-347a-4130-82d5-1c230bdb87cd 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.582 182096 DEBUG nova.compute.manager [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.582 182096 DEBUG oslo_concurrency.lockutils [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.582 182096 DEBUG oslo_concurrency.lockutils [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.583 182096 DEBUG oslo_concurrency.lockutils [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.583 182096 DEBUG nova.compute.manager [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:34 compute-0 nova_compute[182092]: 2026-01-23 09:20:34.583 182096 WARNING nova.compute.manager [req-09a03697-209d-4a0a-97fb-a78b03ab814c req-427a7a15-15b7-448d-a8fe-682199f3aeaf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e for instance with vm_state active and task_state None.
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.008 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Snapshot image upload complete
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.009 182096 DEBUG nova.compute.manager [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.112 182096 INFO nova.compute.manager [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Shelve offloading
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.126 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.126 182096 DEBUG nova.compute.manager [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.128 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.128 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquired lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.128 182096 DEBUG nova.network.neutron [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.288 182096 DEBUG nova.network.neutron [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.495 182096 DEBUG nova.network.neutron [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updated VIF entry in instance network info cache for port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.496 182096 DEBUG nova.network.neutron [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.513 182096 DEBUG oslo_concurrency.lockutils [req-562bd5e3-14ab-450e-b690-487e6e971c30 req-dc326f42-7b16-47b4-8f85-bc0bf9941cdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.530 182096 DEBUG nova.network.neutron [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.539 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Releasing lock "refresh_cache-c58940b5-b4b8-4e1b-a741-cce80dd02096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.545 182096 INFO nova.virt.libvirt.driver [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Instance destroyed successfully.
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.546 182096 DEBUG nova.objects.instance [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lazy-loading 'resources' on Instance uuid c58940b5-b4b8-4e1b-a741-cce80dd02096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.555 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Deleting instance files /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096_del
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.559 182096 INFO nova.virt.libvirt.driver [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Deletion of /var/lib/nova/instances/c58940b5-b4b8-4e1b-a741-cce80dd02096_del complete
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.636 182096 INFO nova.scheduler.client.report [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Deleted allocations for instance c58940b5-b4b8-4e1b-a741-cce80dd02096
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.698 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.699 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.737 182096 DEBUG nova.compute.provider_tree [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.748 182096 DEBUG nova.scheduler.client.report [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.773 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:35 compute-0 nova_compute[182092]: 2026-01-23 09:20:35.831 182096 DEBUG oslo_concurrency.lockutils [None req-9ac9195a-49e3-4681-837c-dc275fc09653 476826df6a2744be869eff79367a0516 ce124b37bc3b4586ad3473321562d9fb - - default default] Lock "c58940b5-b4b8-4e1b-a741-cce80dd02096" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 16.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:36 compute-0 podman[215990]: 2026-01-23 09:20:36.214706578 +0000 UTC m=+0.047971445 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Jan 23 09:20:36 compute-0 ovn_controller[94697]: 2026-01-23T09:20:36Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:b7:b3 10.100.0.13
Jan 23 09:20:36 compute-0 ovn_controller[94697]: 2026-01-23T09:20:36Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:b7:b3 10.100.0.13
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.685 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-234110f5-482d-4f74-8806-8e50183b4820" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.686 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-234110f5-482d-4f74-8806-8e50183b4820" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.701 182096 DEBUG nova.objects.instance [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.721 182096 DEBUG nova.virt.libvirt.vif [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.721 182096 DEBUG nova.network.os_vif_util [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.722 182096 DEBUG nova.network.os_vif_util [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.724 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.725 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.727 182096 DEBUG nova.virt.libvirt.driver [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Attempting to detach device tap234110f5-48 from instance aaa5d40f-5771-495a-9e1a-76dab011324d from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.727 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:f4:4a:5f"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <target dev="tap234110f5-48"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.731 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.733 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface>not found in domain: <domain type='kvm' id='24'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <name>instance-00000038</name>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <uuid>aaa5d40f-5771-495a-9e1a-76dab011324d</uuid>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:34</nova:creationTime>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="234110f5-482d-4f74-8806-8e50183b4820">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='serial'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='uuid'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk' index='2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config' index='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:38:78:6d'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap592a06ee-26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:f4:4a:5f'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap234110f5-48'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bf:1a:d5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap9f6084b0-c2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bd:b7:b3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap7712d7f0-d6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       </target>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </console>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c216,c254</label>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c216,c254</imagelabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:36 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.734 182096 INFO nova.virt.libvirt.driver [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully detached device tap234110f5-48 from instance aaa5d40f-5771-495a-9e1a-76dab011324d from the persistent domain config.
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.735 182096 DEBUG nova.virt.libvirt.driver [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] (1/8): Attempting to detach device tap234110f5-48 with device alias net1 from instance aaa5d40f-5771-495a-9e1a-76dab011324d from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.735 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:f4:4a:5f"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <target dev="tap234110f5-48"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.746 182096 DEBUG nova.compute.manager [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.746 182096 DEBUG oslo_concurrency.lockutils [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.746 182096 DEBUG oslo_concurrency.lockutils [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.746 182096 DEBUG oslo_concurrency.lockutils [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.747 182096 DEBUG nova.compute.manager [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.747 182096 WARNING nova.compute.manager [req-25f82bb3-eb7d-46fe-abba-0c9e2273eae1 req-d4c4cae2-15bd-43a3-9b2f-987bd9dd3740 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e for instance with vm_state active and task_state None.
Jan 23 09:20:36 compute-0 kernel: tap234110f5-48 (unregistering): left promiscuous mode
Jan 23 09:20:36 compute-0 NetworkManager[54920]: <info>  [1769160036.8328] device (tap234110f5-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:20:36 compute-0 ovn_controller[94697]: 2026-01-23T09:20:36Z|00169|binding|INFO|Releasing lport 234110f5-482d-4f74-8806-8e50183b4820 from this chassis (sb_readonly=0)
Jan 23 09:20:36 compute-0 ovn_controller[94697]: 2026-01-23T09:20:36Z|00170|binding|INFO|Setting lport 234110f5-482d-4f74-8806-8e50183b4820 down in Southbound
Jan 23 09:20:36 compute-0 ovn_controller[94697]: 2026-01-23T09:20:36Z|00171|binding|INFO|Removing iface tap234110f5-48 ovn-installed in OVS
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.838 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Received event <DeviceRemovedEvent: 1769160036.83804, aaa5d40f-5771-495a-9e1a-76dab011324d => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.842 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.843 182096 DEBUG nova.virt.libvirt.driver [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Start waiting for the detach event from libvirt for device tap234110f5-48 with device alias net1 for instance aaa5d40f-5771-495a-9e1a-76dab011324d _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.843 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.846 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:4a:5f 10.100.0.3'], port_security=['fa:16:3e:f4:4a:5f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=234110f5-482d-4f74-8806-8e50183b4820) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.847 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 234110f5-482d-4f74-8806-8e50183b4820 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.849 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.851 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface>not found in domain: <domain type='kvm' id='24'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <name>instance-00000038</name>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <uuid>aaa5d40f-5771-495a-9e1a-76dab011324d</uuid>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:34</nova:creationTime>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="234110f5-482d-4f74-8806-8e50183b4820">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='serial'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='uuid'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk' index='2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config' index='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:38:78:6d'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap592a06ee-26'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bf:1a:d5'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap9f6084b0-c2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bd:b7:b3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target dev='tap7712d7f0-d6'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='net3'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       </target>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </console>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c216,c254</label>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c216,c254</imagelabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:36 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.852 182096 INFO nova.virt.libvirt.driver [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully detached device tap234110f5-48 from instance aaa5d40f-5771-495a-9e1a-76dab011324d from the live domain config.
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.853 182096 DEBUG nova.virt.libvirt.vif [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.853 182096 DEBUG nova.network.os_vif_util [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.856 182096 DEBUG nova.network.os_vif_util [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.856 182096 DEBUG os_vif [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.857 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.858 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap234110f5-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.859 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.860 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.863 182096 INFO os_vif [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48')
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.863 182096 DEBUG nova.virt.libvirt.guest [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:36</nova:creationTime>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:36 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:36 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:36 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:36 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:36 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e970bf-c732-4d8b-8ffb-571e51e52b0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.891 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad0d8cc-87f1-4994-8a48-b6b250b21d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.893 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[74969bda-ca98-4ae4-a3bd-4b77f7d2a889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.913 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[23796ce7-ac48-4b94-9333-aead2a75782d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.926 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cef73bfc-37a5-4ac3-98e4-696f31fbdf33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 12, 'rx_bytes': 784, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 12, 'rx_bytes': 784, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216019, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.936 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c72f93f-7d8e-448d-8505-9b80482e628b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216020, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216020, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.938 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.939 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.942 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.942 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.942 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:36.943 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:36 compute-0 nova_compute[182092]: 2026-01-23 09:20:36.958 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:38 compute-0 nova_compute[182092]: 2026-01-23 09:20:38.988 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:20:38 compute-0 nova_compute[182092]: 2026-01-23 09:20:38.988 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:20:38 compute-0 nova_compute[182092]: 2026-01-23 09:20:38.988 182096 DEBUG nova.network.neutron [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.149 182096 DEBUG nova.compute.manager [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.149 182096 DEBUG oslo_concurrency.lockutils [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.149 182096 DEBUG oslo_concurrency.lockutils [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.149 182096 DEBUG oslo_concurrency.lockutils [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.149 182096 DEBUG nova.compute.manager [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.150 182096 WARNING nova.compute.manager [req-45aa5a6b-8746-4084-a364-fa3fffce0aa1 req-f6f1e334-6cf4-4af8-8404-fc908461368f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-234110f5-482d-4f74-8806-8e50183b4820 for instance with vm_state active and task_state None.
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.158 182096 DEBUG nova.compute.manager [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-deleted-234110f5-482d-4f74-8806-8e50183b4820 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.158 182096 INFO nova.compute.manager [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Neutron deleted interface 234110f5-482d-4f74-8806-8e50183b4820; detaching it from the instance and deleting it from the info cache
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.158 182096 DEBUG nova.network.neutron [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.179 182096 DEBUG nova.objects.instance [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lazy-loading 'system_metadata' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.221 182096 DEBUG nova.objects.instance [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lazy-loading 'flavor' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.259 182096 DEBUG nova.virt.libvirt.vif [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.259 182096 DEBUG nova.network.os_vif_util [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.259 182096 DEBUG nova.network.os_vif_util [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.261 182096 DEBUG nova.virt.libvirt.guest [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.263 182096 DEBUG nova.virt.libvirt.guest [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface>not found in domain: <domain type='kvm' id='24'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <name>instance-00000038</name>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <uuid>aaa5d40f-5771-495a-9e1a-76dab011324d</uuid>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:36</nova:creationTime>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='serial'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='uuid'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk' index='2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config' index='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:38:78:6d'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap592a06ee-26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bf:1a:d5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap9f6084b0-c2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bd:b7:b3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap7712d7f0-d6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       </target>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </console>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c216,c254</label>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c216,c254</imagelabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:39 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.263 182096 DEBUG nova.virt.libvirt.guest [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.267 182096 DEBUG nova.virt.libvirt.guest [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:4a:5f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap234110f5-48"/></interface>not found in domain: <domain type='kvm' id='24'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <name>instance-00000038</name>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <uuid>aaa5d40f-5771-495a-9e1a-76dab011324d</uuid>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:36</nova:creationTime>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='serial'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='uuid'>aaa5d40f-5771-495a-9e1a-76dab011324d</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk' index='2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/disk.config' index='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:38:78:6d'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap592a06ee-26'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bf:1a:d5'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap9f6084b0-c2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:bd:b7:b3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target dev='tap7712d7f0-d6'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='net3'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       </target>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d/console.log' append='off'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </console>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </input>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c216,c254</label>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c216,c254</imagelabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:20:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:39 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.267 182096 WARNING nova.virt.libvirt.driver [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Detaching interface fa:16:3e:f4:4a:5f failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap234110f5-48' not found.
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.268 182096 DEBUG nova.virt.libvirt.vif [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.268 182096 DEBUG nova.network.os_vif_util [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converting VIF {"id": "234110f5-482d-4f74-8806-8e50183b4820", "address": "fa:16:3e:f4:4a:5f", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234110f5-48", "ovs_interfaceid": "234110f5-482d-4f74-8806-8e50183b4820", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.268 182096 DEBUG nova.network.os_vif_util [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.269 182096 DEBUG os_vif [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.270 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.270 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap234110f5-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.270 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.272 182096 INFO os_vif [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:4a:5f,bridge_name='br-int',has_traffic_filtering=True,id=234110f5-482d-4f74-8806-8e50183b4820,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234110f5-48')
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.272 182096 DEBUG nova.virt.libvirt.guest [req-afc9765f-d909-43ad-8309-460c5f33621f req-e02fdc4d-4e5c-446c-9f09-d7ca86859753 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:name>tempest-AttachInterfacesTestJSON-server-465181778</nova:name>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:20:39</nova:creationTime>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="592a06ee-2695-49b2-811e-88ecd02c4cc7">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="9f6084b0-c202-400c-b135-db64f24411c2">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     <nova:port uuid="7712d7f0-d6ca-4d95-af9d-6e66a1371f4e">
Jan 23 09:20:39 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:20:39 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:20:39 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:20:39 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:20:39 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.574 103978 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 6282aa7c-c816-4758-aba9-52939c22b988 with type ""
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.575 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:b7:b3 10.100.0.13'], port_security=['fa:16:3e:bd:b7:b3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1561624706', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1561624706', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.575 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.577 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00172|binding|INFO|Removing iface tap7712d7f0-d6 ovn-installed in OVS
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00173|binding|INFO|Removing lport 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e ovn-installed in OVS
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.585 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.592 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.595 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d62b56c5-e6b5-4c88-b49c-7083bfa71835]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.618 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[fd47de0e-5218-425e-8dfd-fd273c80237f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.621 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[98ec7993-4fd9-4133-bfbb-20b63b7ada32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.646 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a11cffea-96fc-4a4c-a94e-b60bc561cc8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.659 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b96d2001-8953-43d2-88c8-f75c56fcdfcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 14, 'rx_bytes': 784, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 14, 'rx_bytes': 784, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216026, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.670 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[116eed51-ecc7-45a6-b6a9-ff61339b863c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216027, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216027, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.672 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.673 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.674 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.674 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.675 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.675 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.789 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.789 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.789 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.789 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.789 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.796 182096 INFO nova.compute.manager [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Terminating instance
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.802 182096 DEBUG nova.compute.manager [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:20:39 compute-0 kernel: tap592a06ee-26 (unregistering): left promiscuous mode
Jan 23 09:20:39 compute-0 NetworkManager[54920]: <info>  [1769160039.8418] device (tap592a06ee-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00174|binding|INFO|Releasing lport 592a06ee-2695-49b2-811e-88ecd02c4cc7 from this chassis (sb_readonly=0)
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00175|binding|INFO|Setting lport 592a06ee-2695-49b2-811e-88ecd02c4cc7 down in Southbound
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00176|binding|INFO|Removing iface tap592a06ee-26 ovn-installed in OVS
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.846 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.857 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.861 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:6d 10.100.0.7'], port_security=['fa:16:3e:38:78:6d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18938afc-cae3-477c-aee1-5f343f0a5140', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=592a06ee-2695-49b2-811e-88ecd02c4cc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:39 compute-0 kernel: tap9f6084b0-c2 (unregistering): left promiscuous mode
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.863 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 592a06ee-2695-49b2-811e-88ecd02c4cc7 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:20:39 compute-0 NetworkManager[54920]: <info>  [1769160039.8652] device (tap9f6084b0-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.865 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00177|binding|INFO|Releasing lport 9f6084b0-c202-400c-b135-db64f24411c2 from this chassis (sb_readonly=0)
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00178|binding|INFO|Setting lport 9f6084b0-c202-400c-b135-db64f24411c2 down in Southbound
Jan 23 09:20:39 compute-0 ovn_controller[94697]: 2026-01-23T09:20:39Z|00179|binding|INFO|Removing iface tap9f6084b0-c2 ovn-installed in OVS
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.872 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.874 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:1a:d5 10.100.0.14'], port_security=['fa:16:3e:bf:1a:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'aaa5d40f-5771-495a-9e1a-76dab011324d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=9f6084b0-c202-400c-b135-db64f24411c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:20:39 compute-0 kernel: tap7712d7f0-d6 (unregistering): left promiscuous mode
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.879 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b835431a-d6fd-4486-9e34-4852d37b8b24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 NetworkManager[54920]: <info>  [1769160039.8903] device (tap7712d7f0-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.897 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.904 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.908 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ae34af-0f3e-4b42-8852-f4e35d1c3ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.910 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[80590bf6-76bd-4a1a-b056-3546b95bc3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000038.scope: Deactivated successfully.
Jan 23 09:20:39 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000038.scope: Consumed 11.742s CPU time.
Jan 23 09:20:39 compute-0 systemd-machined[153562]: Machine qemu-24-instance-00000038 terminated.
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.932 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d832d78a-caaf-4c3d-8c02-5136d4bb7271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.945 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[65609893-5c67-49fd-aca9-77ad0bc4db19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 16, 'rx_bytes': 784, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 16, 'rx_bytes': 784, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349659, 'reachable_time': 27397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216053, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.957 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[66b2da46-ff23-4df0-8195-a9dd10cf93eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349669, 'tstamp': 349669}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216054, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349671, 'tstamp': 349671}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216054, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.958 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.959 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 nova_compute[182092]: 2026-01-23 09:20:39.965 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.965 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.965 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.966 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.966 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.967 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 9f6084b0-c202-400c-b135-db64f24411c2 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.968 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 502ff19d-7b13-4dc2-8ece-02806b418ba0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.968 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5339173e-57cd-47d8-8a41-1a3e45cbcb62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:39.969 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 namespace which is not needed anymore
Jan 23 09:20:40 compute-0 NetworkManager[54920]: <info>  [1769160040.0152] manager: (tap592a06ee-26): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 23 09:20:40 compute-0 systemd-udevd[216032]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:20:40 compute-0 NetworkManager[54920]: <info>  [1769160040.0425] manager: (tap7712d7f0-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.070 182096 INFO nova.virt.libvirt.driver [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Instance destroyed successfully.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.070 182096 DEBUG nova.objects.instance [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'resources' on Instance uuid aaa5d40f-5771-495a-9e1a-76dab011324d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.084 182096 DEBUG nova.virt.libvirt.vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.085 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.085 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.086 182096 DEBUG os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:40 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [NOTICE]   (215444) : haproxy version is 2.8.14-c23fe91
Jan 23 09:20:40 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [NOTICE]   (215444) : path to executable is /usr/sbin/haproxy
Jan 23 09:20:40 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [WARNING]  (215444) : Exiting Master process...
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.086 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.087 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap592a06ee-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:40 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [ALERT]    (215444) : Current worker (215447) exited with code 143 (Terminated)
Jan 23 09:20:40 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[215440]: [WARNING]  (215444) : All workers exited. Exiting... (0)
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.088 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.089 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:20:40 compute-0 systemd[1]: libpod-f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12.scope: Deactivated successfully.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.093 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 podman[216085]: 2026-01-23 09:20:40.098761646 +0000 UTC m=+0.045870821 container died f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.099 182096 INFO os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:6d,bridge_name='br-int',has_traffic_filtering=True,id=592a06ee-2695-49b2-811e-88ecd02c4cc7,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap592a06ee-26')
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.101 182096 DEBUG nova.virt.libvirt.vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.102 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.103 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.104 182096 DEBUG os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.106 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.106 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f6084b0-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.107 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.109 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.112 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.114 182096 INFO os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:1a:d5,bridge_name='br-int',has_traffic_filtering=True,id=9f6084b0-c202-400c-b135-db64f24411c2,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f6084b0-c2')
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.115 182096 DEBUG nova.virt.libvirt.vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:19:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-465181778',display_name='tempest-AttachInterfacesTestJSON-server-465181778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-465181778',id=56,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCe4/Otb4hawnRFYnvo4CAqA/ytUhhndRyEDBd1in0pzG/02FvrR4ijgmNPBhQjW0SkFvyj6KWgwgPhx47rI/0qjk6WrKDu9fMaKLM1Luo/Xy4DT2IEfZTkkw+NDqsdTeA==',key_name='tempest-keypair-947369209',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:19:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-9b5jmbth',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:19:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=aaa5d40f-5771-495a-9e1a-76dab011324d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.115 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.115 182096 DEBUG nova.network.os_vif_util [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:20:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12-userdata-shm.mount: Deactivated successfully.
Jan 23 09:20:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-4cc901fb96740599edac43ffe76c6706020a93445cbdfbb297a6474cb59ad00b-merged.mount: Deactivated successfully.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.119 182096 DEBUG os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.120 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 podman[216085]: 2026-01-23 09:20:40.120579734 +0000 UTC m=+0.067688899 container cleanup f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.120 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7712d7f0-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.121 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.122 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.123 182096 INFO os_vif [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:b7:b3,bridge_name='br-int',has_traffic_filtering=True,id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7712d7f0-d6')
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.123 182096 INFO nova.virt.libvirt.driver [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Deleting instance files /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d_del
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.124 182096 INFO nova.virt.libvirt.driver [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Deletion of /var/lib/nova/instances/aaa5d40f-5771-495a-9e1a-76dab011324d_del complete
Jan 23 09:20:40 compute-0 systemd[1]: libpod-conmon-f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12.scope: Deactivated successfully.
Jan 23 09:20:40 compute-0 podman[216134]: 2026-01-23 09:20:40.166145729 +0000 UTC m=+0.024180226 container remove f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.169 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cfbdc28f-590d-40cf-8047-9d3a31cbcd54]: (4, ('Fri Jan 23 09:20:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 (f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12)\nf4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12\nFri Jan 23 09:20:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 (f4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12)\nf4181a4ddf3b06ce920a9739da1fa27d5d0085741d6bcb535f69962ea90a0e12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.170 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e122fd6-7e8c-4cf8-8e80-67d645859571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.171 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.172 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 kernel: tap502ff19d-70: left promiscuous mode
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.184 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.186 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf7d610-74b5-4a84-9a6a-bd4d8b977bbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.194 182096 INFO nova.compute.manager [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.195 182096 DEBUG oslo.service.loopingcall [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.196 182096 DEBUG nova.compute.manager [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.196 182096 DEBUG nova.network.neutron [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.199 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[da85f30f-bdda-4f58-9e35-931d599afcf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.199 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[102bb405-78e4-49ca-996e-4a5bd5b5a3a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.211 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d4137c-1c96-48e2-9801-8ce67a9e571c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349653, 'reachable_time': 23561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216146, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d502ff19d\x2d7b13\x2d4dc2\x2d8ece\x2d02806b418ba0.mount: Deactivated successfully.
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.213 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:20:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:20:40.213 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[bd45e14c-26cd-4d14-97d0-c664d380c7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.590 182096 DEBUG nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-unplugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.591 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.591 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.591 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.591 182096 DEBUG nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-unplugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.591 182096 DEBUG nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-unplugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 DEBUG nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 DEBUG oslo_concurrency.lockutils [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 DEBUG nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.592 182096 WARNING nova.compute.manager [req-b2a74ba2-6237-432c-a508-661957863fd3 req-22a79c93-fd43-4fcc-b465-efc0419d3844 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-592a06ee-2695-49b2-811e-88ecd02c4cc7 for instance with vm_state active and task_state deleting.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.598 182096 INFO nova.network.neutron [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Port 234110f5-482d-4f74-8806-8e50183b4820 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.605 182096 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 23 09:20:40 compute-0 nova_compute[182092]: 2026-01-23 09:20:40.605 182096 DEBUG nova.network.neutron [-] Unable to show port 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.268 182096 DEBUG nova.compute.manager [req-f1121358-de84-46ea-af01-f5aeffe29085 req-cd632f6c-4fb1-455d-a912-e10a98ecebdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-deleted-7712d7f0-d6ca-4d95-af9d-6e66a1371f4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.268 182096 INFO nova.compute.manager [req-f1121358-de84-46ea-af01-f5aeffe29085 req-cd632f6c-4fb1-455d-a912-e10a98ecebdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Neutron deleted interface 7712d7f0-d6ca-4d95-af9d-6e66a1371f4e; detaching it from the instance and deleting it from the info cache
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.268 182096 DEBUG nova.network.neutron [req-f1121358-de84-46ea-af01-f5aeffe29085 req-cd632f6c-4fb1-455d-a912-e10a98ecebdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.300 182096 DEBUG nova.compute.manager [req-f1121358-de84-46ea-af01-f5aeffe29085 req-cd632f6c-4fb1-455d-a912-e10a98ecebdd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Detach interface failed, port_id=7712d7f0-d6ca-4d95-af9d-6e66a1371f4e, reason: Instance aaa5d40f-5771-495a-9e1a-76dab011324d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.913 182096 DEBUG nova.network.neutron [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "address": "fa:16:3e:38:78:6d", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap592a06ee-26", "ovs_interfaceid": "592a06ee-2695-49b2-811e-88ecd02c4cc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.926 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-aaa5d40f-5771-495a-9e1a-76dab011324d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.954 182096 DEBUG oslo_concurrency.lockutils [None req-7480139a-883c-47bd-b302-c031243818f8 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-aaa5d40f-5771-495a-9e1a-76dab011324d-234110f5-482d-4f74-8806-8e50183b4820" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:41 compute-0 nova_compute[182092]: 2026-01-23 09:20:41.960 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.691 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-unplugged-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.692 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.692 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.692 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.692 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-unplugged-9f6084b0-c202-400c-b135-db64f24411c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.692 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-unplugged-9f6084b0-c202-400c-b135-db64f24411c2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.693 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.693 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.693 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.693 182096 DEBUG oslo_concurrency.lockutils [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.693 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] No waiting events found dispatching network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.694 182096 WARNING nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received unexpected event network-vif-plugged-9f6084b0-c202-400c-b135-db64f24411c2 for instance with vm_state active and task_state deleting.
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.694 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-deleted-592a06ee-2695-49b2-811e-88ecd02c4cc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.694 182096 INFO nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Neutron deleted interface 592a06ee-2695-49b2-811e-88ecd02c4cc7; detaching it from the instance and deleting it from the info cache
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.694 182096 DEBUG nova.network.neutron [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [{"id": "9f6084b0-c202-400c-b135-db64f24411c2", "address": "fa:16:3e:bf:1a:d5", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f6084b0-c2", "ovs_interfaceid": "9f6084b0-c202-400c-b135-db64f24411c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "address": "fa:16:3e:bd:b7:b3", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7712d7f0-d6", "ovs_interfaceid": "7712d7f0-d6ca-4d95-af9d-6e66a1371f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:42 compute-0 nova_compute[182092]: 2026-01-23 09:20:42.712 182096 DEBUG nova.compute.manager [req-c48f4ad6-5b76-4c10-b69c-eb613c6b1810 req-c603448b-5a86-4eb2-84db-699f96c74848 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Detach interface failed, port_id=592a06ee-2695-49b2-811e-88ecd02c4cc7, reason: Instance aaa5d40f-5771-495a-9e1a-76dab011324d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.370 182096 DEBUG nova.network.neutron [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.388 182096 INFO nova.compute.manager [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Took 3.19 seconds to deallocate network for instance.
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.430 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.430 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.474 182096 DEBUG nova.compute.provider_tree [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.483 182096 DEBUG nova.scheduler.client.report [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.500 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.517 182096 INFO nova.scheduler.client.report [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Deleted allocations for instance aaa5d40f-5771-495a-9e1a-76dab011324d
Jan 23 09:20:43 compute-0 nova_compute[182092]: 2026-01-23 09:20:43.567 182096 DEBUG oslo_concurrency.lockutils [None req-59cbe17b-d9a1-4060-8faa-658509bc4628 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "aaa5d40f-5771-495a-9e1a-76dab011324d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:44 compute-0 nova_compute[182092]: 2026-01-23 09:20:44.002 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:44 compute-0 nova_compute[182092]: 2026-01-23 09:20:44.157 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:44 compute-0 podman[216147]: 2026-01-23 09:20:44.229273096 +0000 UTC m=+0.064691923 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:20:44 compute-0 nova_compute[182092]: 2026-01-23 09:20:44.795 182096 DEBUG nova.compute.manager [req-6880f5f4-8d90-4975-9f42-c149b6d02de1 req-2e09b390-f369-42ac-b56f-fdcfeb9cce73 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Received event network-vif-deleted-9f6084b0-c202-400c-b135-db64f24411c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:20:45 compute-0 nova_compute[182092]: 2026-01-23 09:20:45.122 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:46 compute-0 nova_compute[182092]: 2026-01-23 09:20:46.933 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160031.9319696, c58940b5-b4b8-4e1b-a741-cce80dd02096 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:46 compute-0 nova_compute[182092]: 2026-01-23 09:20:46.933 182096 INFO nova.compute.manager [-] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] VM Stopped (Lifecycle Event)
Jan 23 09:20:46 compute-0 nova_compute[182092]: 2026-01-23 09:20:46.952 182096 DEBUG nova.compute.manager [None req-8b3c144f-f0ae-41d2-a06b-137dd8e7be09 - - - - - -] [instance: c58940b5-b4b8-4e1b-a741-cce80dd02096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:46 compute-0 nova_compute[182092]: 2026-01-23 09:20:46.961 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:50 compute-0 nova_compute[182092]: 2026-01-23 09:20:50.123 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:51 compute-0 nova_compute[182092]: 2026-01-23 09:20:51.962 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:52 compute-0 podman[216171]: 2026-01-23 09:20:52.203381134 +0000 UTC m=+0.040845901 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:20:52 compute-0 podman[216172]: 2026-01-23 09:20:52.204243911 +0000 UTC m=+0.040403827 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:20:55 compute-0 nova_compute[182092]: 2026-01-23 09:20:55.069 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160040.0680475, aaa5d40f-5771-495a-9e1a-76dab011324d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:55 compute-0 nova_compute[182092]: 2026-01-23 09:20:55.070 182096 INFO nova.compute.manager [-] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] VM Stopped (Lifecycle Event)
Jan 23 09:20:55 compute-0 nova_compute[182092]: 2026-01-23 09:20:55.085 182096 DEBUG nova.compute.manager [None req-6bc7c9ee-81e4-47d7-a5d3-7585efb0cc27 - - - - - -] [instance: aaa5d40f-5771-495a-9e1a-76dab011324d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:55 compute-0 nova_compute[182092]: 2026-01-23 09:20:55.126 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:55 compute-0 nova_compute[182092]: 2026-01-23 09:20:55.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.436 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.436 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.457 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.538 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.538 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.542 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.542 182096 INFO nova.compute.claims [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.648 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.660 182096 DEBUG nova.compute.provider_tree [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.672 182096 DEBUG nova.scheduler.client.report [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.687 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.688 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.751 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.752 182096 DEBUG nova.network.neutron [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.767 182096 INFO nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.783 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.875 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.876 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.876 182096 INFO nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Creating image(s)
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.876 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.877 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.877 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.887 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.934 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.935 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.935 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.944 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.965 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.990 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:56 compute-0 nova_compute[182092]: 2026-01-23 09:20:56.991 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.013 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.013 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.014 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.044 182096 DEBUG nova.network.neutron [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.044 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.058 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.059 182096 DEBUG nova.virt.disk.api [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Checking if we can resize image /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.059 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.104 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.105 182096 DEBUG nova.virt.disk.api [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Cannot resize image /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.105 182096 DEBUG nova.objects.instance [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lazy-loading 'migration_context' on Instance uuid bb4eac91-65f9-4ef0-a1e5-458feef0f384 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.139 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.139 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Ensure instance console log exists: /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.139 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.140 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.140 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.141 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.144 182096 WARNING nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.149 182096 DEBUG nova.virt.libvirt.host [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.150 182096 DEBUG nova.virt.libvirt.host [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.154 182096 DEBUG nova.virt.libvirt.host [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.154 182096 DEBUG nova.virt.libvirt.host [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.155 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.155 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.155 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.156 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.156 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.156 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.156 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.156 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.157 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.157 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.157 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.157 182096 DEBUG nova.virt.hardware [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.160 182096 DEBUG nova.objects.instance [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lazy-loading 'pci_devices' on Instance uuid bb4eac91-65f9-4ef0-a1e5-458feef0f384 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.181 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <uuid>bb4eac91-65f9-4ef0-a1e5-458feef0f384</uuid>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <name>instance-0000003e</name>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:name>tempest-TenantUsagesTestJSON-server-465198873</nova:name>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:20:57</nova:creationTime>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:user uuid="2070d7c21cf34064b652a3caa7f95b81">tempest-TenantUsagesTestJSON-377865135-project-member</nova:user>
Jan 23 09:20:57 compute-0 nova_compute[182092]:         <nova:project uuid="283aedeeb4244b859ca2932a553eb1ce">tempest-TenantUsagesTestJSON-377865135</nova:project>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <system>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="serial">bb4eac91-65f9-4ef0-a1e5-458feef0f384</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="uuid">bb4eac91-65f9-4ef0-a1e5-458feef0f384</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </system>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <os>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </os>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <features>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </features>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.config"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/console.log" append="off"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <video>
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </video>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:20:57 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:20:57 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:20:57 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:20:57 compute-0 nova_compute[182092]: </domain>
Jan 23 09:20:57 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.245 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.245 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.246 182096 INFO nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Using config drive
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.690 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.690 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.690 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.690 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.732 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.778 182096 INFO nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Creating config drive at /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.config
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.783 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu9aomhe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.796 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.796 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.843 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 nova_compute[182092]: 2026-01-23 09:20:57.902 182096 DEBUG oslo_concurrency.processutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu9aomhe" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:20:57 compute-0 systemd-machined[153562]: New machine qemu-27-instance-0000003e.
Jan 23 09:20:57 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-0000003e.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.063 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.064 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.30596923828125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.064 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.065 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance bb4eac91-65f9-4ef0-a1e5-458feef0f384 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.148 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.159 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.173 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160058.1735058, bb4eac91-65f9-4ef0-a1e5-458feef0f384 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.173 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] VM Resumed (Lifecycle Event)
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.175 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.175 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.177 182096 INFO nova.virt.libvirt.driver [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance spawned successfully.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.177 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.185 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.186 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.200 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.202 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.202 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.203 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.203 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.203 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.204 182096 DEBUG nova.virt.libvirt.driver [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.206 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.232 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.232 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160058.1744478, bb4eac91-65f9-4ef0-a1e5-458feef0f384 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.233 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] VM Started (Lifecycle Event)
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.266 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.268 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.295 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.301 182096 INFO nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Took 1.43 seconds to spawn the instance on the hypervisor.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.302 182096 DEBUG nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.364 182096 INFO nova.compute.manager [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Took 1.85 seconds to build instance.
Jan 23 09:20:58 compute-0 nova_compute[182092]: 2026-01-23 09:20:58.375 182096 DEBUG oslo_concurrency.lockutils [None req-00c025e9-484e-4aa6-8aa7-b74d6685b519 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:20:59 compute-0 nova_compute[182092]: 2026-01-23 09:20:59.185 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:59 compute-0 nova_compute[182092]: 2026-01-23 09:20:59.186 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:20:59 compute-0 nova_compute[182092]: 2026-01-23 09:20:59.275 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:20:59 compute-0 nova_compute[182092]: 2026-01-23 09:20:59.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:20:59 compute-0 nova_compute[182092]: 2026-01-23 09:20:59.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.127 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.176 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.176 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.176 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.176 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.177 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.184 182096 INFO nova.compute.manager [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Terminating instance
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.191 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "refresh_cache-bb4eac91-65f9-4ef0-a1e5-458feef0f384" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.191 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquired lock "refresh_cache-bb4eac91-65f9-4ef0-a1e5-458feef0f384" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.191 182096 DEBUG nova.network.neutron [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.439 182096 DEBUG nova.network.neutron [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.705 182096 DEBUG nova.network.neutron [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.719 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Releasing lock "refresh_cache-bb4eac91-65f9-4ef0-a1e5-458feef0f384" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.720 182096 DEBUG nova.compute.manager [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:21:00 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 23 09:21:00 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003e.scope: Consumed 2.773s CPU time.
Jan 23 09:21:00 compute-0 systemd-machined[153562]: Machine qemu-27-instance-0000003e terminated.
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.945 182096 INFO nova.virt.libvirt.driver [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance destroyed successfully.
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.945 182096 DEBUG nova.objects.instance [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lazy-loading 'resources' on Instance uuid bb4eac91-65f9-4ef0-a1e5-458feef0f384 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.966 182096 INFO nova.virt.libvirt.driver [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Deleting instance files /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384_del
Jan 23 09:21:00 compute-0 nova_compute[182092]: 2026-01-23 09:21:00.967 182096 INFO nova.virt.libvirt.driver [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Deletion of /var/lib/nova/instances/bb4eac91-65f9-4ef0-a1e5-458feef0f384_del complete
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.018 182096 INFO nova.compute.manager [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.018 182096 DEBUG oslo.service.loopingcall [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.019 182096 DEBUG nova.compute.manager [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.019 182096 DEBUG nova.network.neutron [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.166 182096 DEBUG nova.network.neutron [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.180 182096 DEBUG nova.network.neutron [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.190 182096 INFO nova.compute.manager [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Took 0.17 seconds to deallocate network for instance.
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.242 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.242 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.291 182096 DEBUG nova.compute.provider_tree [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.300 182096 DEBUG nova.scheduler.client.report [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.318 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.342 182096 INFO nova.scheduler.client.report [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Deleted allocations for instance bb4eac91-65f9-4ef0-a1e5-458feef0f384
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.395 182096 DEBUG oslo_concurrency.lockutils [None req-3ed1f028-707e-4a62-a661-758947edc116 2070d7c21cf34064b652a3caa7f95b81 283aedeeb4244b859ca2932a553eb1ce - - default default] Lock "bb4eac91-65f9-4ef0-a1e5-458feef0f384" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:01 compute-0 nova_compute[182092]: 2026-01-23 09:21:01.965 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.320 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.320 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.332 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.395 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.395 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.399 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.399 182096 INFO nova.compute.claims [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.479 182096 DEBUG nova.compute.provider_tree [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.504 182096 DEBUG nova.scheduler.client.report [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.528 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.528 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.589 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.590 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.605 182096 INFO nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.619 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.706 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.706 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.707 182096 INFO nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Creating image(s)
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.707 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.707 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.708 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.718 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.762 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.763 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.764 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.773 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.815 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.816 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.836 182096 DEBUG nova.policy [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.839 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.840 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.840 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.882 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.883 182096 DEBUG nova.virt.disk.api [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Checking if we can resize image /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.883 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.926 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.926 182096 DEBUG nova.virt.disk.api [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Cannot resize image /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.927 182096 DEBUG nova.objects.instance [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'migration_context' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.948 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.948 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Ensure instance console log exists: /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.948 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.949 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:03 compute-0 nova_compute[182092]: 2026-01-23 09:21:03.949 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:04 compute-0 podman[216283]: 2026-01-23 09:21:04.211221755 +0000 UTC m=+0.042221907 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:21:04 compute-0 podman[216282]: 2026-01-23 09:21:04.236131539 +0000 UTC m=+0.067815934 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:21:04 compute-0 nova_compute[182092]: 2026-01-23 09:21:04.913 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Successfully created port: 7c108f42-128c-4c60-86d0-3b9925670206 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.129 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.740 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Successfully updated port: 7c108f42-128c-4c60-86d0-3b9925670206 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.752 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.753 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.753 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:21:05 compute-0 nova_compute[182092]: 2026-01-23 09:21:05.981 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.797 182096 DEBUG nova.network.neutron [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.805 182096 DEBUG nova.compute.manager [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-changed-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.806 182096 DEBUG nova.compute.manager [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing instance network info cache due to event network-changed-7c108f42-128c-4c60-86d0-3b9925670206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.806 182096 DEBUG oslo_concurrency.lockutils [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.811 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.811 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Instance network_info: |[{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.811 182096 DEBUG oslo_concurrency.lockutils [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.811 182096 DEBUG nova.network.neutron [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.813 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Start _get_guest_xml network_info=[{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.817 182096 WARNING nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.821 182096 DEBUG nova.virt.libvirt.host [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.822 182096 DEBUG nova.virt.libvirt.host [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.824 182096 DEBUG nova.virt.libvirt.host [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.824 182096 DEBUG nova.virt.libvirt.host [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.825 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.825 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.826 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.827 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.827 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.827 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.827 182096 DEBUG nova.virt.hardware [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.830 182096 DEBUG nova.virt.libvirt.vif [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:21:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.830 182096 DEBUG nova.network.os_vif_util [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.831 182096 DEBUG nova.network.os_vif_util [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.831 182096 DEBUG nova.objects.instance [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.842 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <uuid>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</uuid>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <name>instance-00000040</name>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:name>tempest-tempest.common.compute-instance-782237197</nova:name>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:21:06</nova:creationTime>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         <nova:port uuid="7c108f42-128c-4c60-86d0-3b9925670206">
Jan 23 09:21:06 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <system>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="serial">2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="uuid">2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </system>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <os>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </os>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <features>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </features>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:7a:c2:8b"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <target dev="tap7c108f42-12"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log" append="off"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <video>
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </video>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:21:06 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:21:06 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:21:06 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:21:06 compute-0 nova_compute[182092]: </domain>
Jan 23 09:21:06 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.843 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Preparing to wait for external event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.843 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.844 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.844 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.844 182096 DEBUG nova.virt.libvirt.vif [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:21:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.845 182096 DEBUG nova.network.os_vif_util [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.845 182096 DEBUG nova.network.os_vif_util [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.845 182096 DEBUG os_vif [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.846 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.846 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.846 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.848 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.848 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c108f42-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.849 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c108f42-12, col_values=(('external_ids', {'iface-id': '7c108f42-128c-4c60-86d0-3b9925670206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:c2:8b', 'vm-uuid': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.850 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:06 compute-0 NetworkManager[54920]: <info>  [1769160066.8513] manager: (tap7c108f42-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.855 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.855 182096 INFO os_vif [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12')
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.901 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.901 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.901 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:7a:c2:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.902 182096 INFO nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Using config drive
Jan 23 09:21:06 compute-0 podman[216322]: 2026-01-23 09:21:06.912747793 +0000 UTC m=+0.037944166 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Jan 23 09:21:06 compute-0 nova_compute[182092]: 2026-01-23 09:21:06.965 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.360 182096 INFO nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Creating config drive at /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.366 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup4unzny execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.485 182096 DEBUG oslo_concurrency.processutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup4unzny" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:07 compute-0 kernel: tap7c108f42-12: entered promiscuous mode
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.5228] manager: (tap7c108f42-12): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Jan 23 09:21:07 compute-0 ovn_controller[94697]: 2026-01-23T09:21:07Z|00180|binding|INFO|Claiming lport 7c108f42-128c-4c60-86d0-3b9925670206 for this chassis.
Jan 23 09:21:07 compute-0 ovn_controller[94697]: 2026-01-23T09:21:07Z|00181|binding|INFO|7c108f42-128c-4c60-86d0-3b9925670206: Claiming fa:16:3e:7a:c2:8b 10.100.0.12
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.532 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.5379] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.5382] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.537 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.539 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:c2:8b 10.100.0.12'], port_security=['fa:16:3e:7a:c2:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8f3fb4cf-2840-4de0-97eb-8fffad852b11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7c108f42-128c-4c60-86d0-3b9925670206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.540 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7c108f42-128c-4c60-86d0-3b9925670206 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.541 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:21:07 compute-0 systemd-machined[153562]: New machine qemu-28-instance-00000040.
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.549 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d90625-6dfb-4ac7-a29c-f138ea32fab1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.550 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap502ff19d-71 in ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.553 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap502ff19d-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.553 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[576fe39e-3787-4545-9ea9-06e01f751f44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.553 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b740f8a-7d9d-4b0c-8f3e-1e5632be3535]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 systemd-udevd[216359]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.561 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d817bb-47a6-4851-9f7e-8e5db07f13df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.5657] device (tap7c108f42-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.5661] device (tap7c108f42-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:21:07 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000040.
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.586 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[be936691-69ed-47be-b0a0-46722a799c0e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.606 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa5280e-66c4-46f1-b69e-99ea01765386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.6225] manager: (tap502ff19d-70): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.621 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8170a7ee-5763-4e9e-8df3-d52002b15d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.645 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ba5a59-7893-4797-ab97-c9a81a5af25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.647 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8156be3d-1051-4aea-9993-f954a598578f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.6599] device (tap502ff19d-70): carrier: link connected
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.663 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea1eb0f-4b29-4561-8b8b-4df20c50a308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.674 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4c6d198f-3ad9-49be-bf44-f46780efdf2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357314, 'reachable_time': 31993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216383, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.683 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5987ae87-f9a9-4aae-a779-ad2b9e02f241]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:cef2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357314, 'tstamp': 357314}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216384, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.692 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0ede43c2-9647-44d8-a7d4-35ec6b24da4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357314, 'reachable_time': 31993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216385, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.696 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.709 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.714 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b88fd8d6-f876-4021-a547-784244471618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_controller[94697]: 2026-01-23T09:21:07Z|00182|binding|INFO|Setting lport 7c108f42-128c-4c60-86d0-3b9925670206 ovn-installed in OVS
Jan 23 09:21:07 compute-0 ovn_controller[94697]: 2026-01-23T09:21:07Z|00183|binding|INFO|Setting lport 7c108f42-128c-4c60-86d0-3b9925670206 up in Southbound
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.720 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.754 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[15078c4d-5c6d-4ac0-936b-46df30c5959d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.755 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.755 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.756 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:07 compute-0 kernel: tap502ff19d-70: entered promiscuous mode
Jan 23 09:21:07 compute-0 NetworkManager[54920]: <info>  [1769160067.7587] manager: (tap502ff19d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.757 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.760 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:07 compute-0 ovn_controller[94697]: 2026-01-23T09:21:07Z|00184|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.763 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.763 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b50e42-9d05-40d6-bd69-5ae894ae4f50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.764 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/502ff19d-7b13-4dc2-8ece-02806b418ba0.pid.haproxy
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:21:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:07.765 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'env', 'PROCESS_TAG=haproxy-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/502ff19d-7b13-4dc2-8ece-02806b418ba0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.773 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.798 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160067.7985165, 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.799 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] VM Started (Lifecycle Event)
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.819 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.822 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160067.798696, 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.822 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] VM Paused (Lifecycle Event)
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.836 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.838 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:21:07 compute-0 nova_compute[182092]: 2026-01-23 09:21:07.856 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:21:08 compute-0 podman[216420]: 2026-01-23 09:21:08.049897672 +0000 UTC m=+0.028578847 container create ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:21:08 compute-0 systemd[1]: Started libpod-conmon-ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575.scope.
Jan 23 09:21:08 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:21:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71e7e25259d786d533040dd75b3cb0a96c25fd49187cab7e7ebcd567d8d63ee1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:21:08 compute-0 podman[216420]: 2026-01-23 09:21:08.102907361 +0000 UTC m=+0.081588547 container init ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:21:08 compute-0 podman[216420]: 2026-01-23 09:21:08.107516397 +0000 UTC m=+0.086197562 container start ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:21:08 compute-0 podman[216420]: 2026-01-23 09:21:08.037337547 +0000 UTC m=+0.016018712 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:21:08 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [NOTICE]   (216436) : New worker (216438) forked
Jan 23 09:21:08 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [NOTICE]   (216436) : Loading success.
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.285 182096 DEBUG nova.network.neutron [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updated VIF entry in instance network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.286 182096 DEBUG nova.network.neutron [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.299 182096 DEBUG oslo_concurrency.lockutils [req-9230f9e0-5bfb-40c5-a460-1b200d5ceaca req-57210f56-4305-4d19-a5ff-9ce6774c15a3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.925 182096 DEBUG nova.compute.manager [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.926 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.926 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.926 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.926 182096 DEBUG nova.compute.manager [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Processing event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.927 182096 DEBUG nova.compute.manager [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.927 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.927 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.927 182096 DEBUG oslo_concurrency.lockutils [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.928 182096 DEBUG nova.compute.manager [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.928 182096 WARNING nova.compute.manager [req-07a99105-1543-4c0e-be9a-c8cfcf923476 req-60ab788f-5a35-46c4-ae77-efc22a0119b0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received unexpected event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 for instance with vm_state building and task_state spawning.
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.929 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.933 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160068.9322712, 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.933 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] VM Resumed (Lifecycle Event)
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.934 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.937 182096 INFO nova.virt.libvirt.driver [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Instance spawned successfully.
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.938 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.949 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.951 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.959 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.960 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.960 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.961 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.961 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.961 182096 DEBUG nova.virt.libvirt.driver [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:08 compute-0 nova_compute[182092]: 2026-01-23 09:21:08.965 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:21:09 compute-0 nova_compute[182092]: 2026-01-23 09:21:09.054 182096 INFO nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Took 5.35 seconds to spawn the instance on the hypervisor.
Jan 23 09:21:09 compute-0 nova_compute[182092]: 2026-01-23 09:21:09.054 182096 DEBUG nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:09 compute-0 nova_compute[182092]: 2026-01-23 09:21:09.208 182096 INFO nova.compute.manager [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Took 5.83 seconds to build instance.
Jan 23 09:21:09 compute-0 nova_compute[182092]: 2026-01-23 09:21:09.239 182096 DEBUG oslo_concurrency.lockutils [None req-bf336a67-eeaa-4591-8bce-3268c852ced5 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:10 compute-0 nova_compute[182092]: 2026-01-23 09:21:10.400 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:11 compute-0 nova_compute[182092]: 2026-01-23 09:21:11.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:11 compute-0 nova_compute[182092]: 2026-01-23 09:21:11.966 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:14 compute-0 nova_compute[182092]: 2026-01-23 09:21:14.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:14 compute-0 ovn_controller[94697]: 2026-01-23T09:21:14Z|00185|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:21:14 compute-0 nova_compute[182092]: 2026-01-23 09:21:14.721 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:15 compute-0 podman[216443]: 2026-01-23 09:21:15.219485304 +0000 UTC m=+0.057652129 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.751 182096 DEBUG nova.compute.manager [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-changed-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.751 182096 DEBUG nova.compute.manager [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing instance network info cache due to event network-changed-7c108f42-128c-4c60-86d0-3b9925670206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.752 182096 DEBUG oslo_concurrency.lockutils [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.752 182096 DEBUG oslo_concurrency.lockutils [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.752 182096 DEBUG nova.network.neutron [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.945 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160060.9440432, bb4eac91-65f9-4ef0-a1e5-458feef0f384 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:15 compute-0 nova_compute[182092]: 2026-01-23 09:21:15.945 182096 INFO nova.compute.manager [-] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] VM Stopped (Lifecycle Event)
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.625 182096 DEBUG nova.compute.manager [None req-49688c91-1f14-471e-95d6-3dd08e09ff1a - - - - - -] [instance: bb4eac91-65f9-4ef0-a1e5-458feef0f384] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.855 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.924 182096 DEBUG nova.compute.manager [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-changed-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.924 182096 DEBUG nova.compute.manager [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing instance network info cache due to event network-changed-7c108f42-128c-4c60-86d0-3b9925670206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.925 182096 DEBUG oslo_concurrency.lockutils [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:16 compute-0 nova_compute[182092]: 2026-01-23 09:21:16.967 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:17 compute-0 nova_compute[182092]: 2026-01-23 09:21:17.535 182096 DEBUG nova.network.neutron [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updated VIF entry in instance network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:21:17 compute-0 nova_compute[182092]: 2026-01-23 09:21:17.536 182096 DEBUG nova.network.neutron [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:18 compute-0 nova_compute[182092]: 2026-01-23 09:21:18.488 182096 DEBUG oslo_concurrency.lockutils [req-5ec1fc0f-569c-4407-b454-1dea5105a2cc req-8410c4a6-7ede-48a1-a466-ffc1b770c5ff 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:18 compute-0 nova_compute[182092]: 2026-01-23 09:21:18.489 182096 DEBUG oslo_concurrency.lockutils [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:18 compute-0 nova_compute[182092]: 2026-01-23 09:21:18.489 182096 DEBUG nova.network.neutron [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:21:19 compute-0 nova_compute[182092]: 2026-01-23 09:21:19.855 182096 DEBUG nova.network.neutron [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updated VIF entry in instance network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:21:19 compute-0 nova_compute[182092]: 2026-01-23 09:21:19.856 182096 DEBUG nova.network.neutron [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:19 compute-0 nova_compute[182092]: 2026-01-23 09:21:19.874 182096 DEBUG oslo_concurrency.lockutils [req-0297b486-c30c-4272-b897-5ad8d4775525 req-c7fa1393-a3d6-49a3-9a6a-f9b419befbed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:20 compute-0 ovn_controller[94697]: 2026-01-23T09:21:20Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:c2:8b 10.100.0.12
Jan 23 09:21:20 compute-0 ovn_controller[94697]: 2026-01-23T09:21:20Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:c2:8b 10.100.0.12
Jan 23 09:21:21 compute-0 nova_compute[182092]: 2026-01-23 09:21:21.859 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:21 compute-0 nova_compute[182092]: 2026-01-23 09:21:21.970 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:23 compute-0 podman[216482]: 2026-01-23 09:21:23.203720115 +0000 UTC m=+0.035776768 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:21:23 compute-0 podman[216481]: 2026-01-23 09:21:23.208239852 +0000 UTC m=+0.044689593 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 23 09:21:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:23.618 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:21:23 compute-0 nova_compute[182092]: 2026-01-23 09:21:23.618 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:23.621 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:21:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:23.622 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:24 compute-0 nova_compute[182092]: 2026-01-23 09:21:24.353 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:26 compute-0 nova_compute[182092]: 2026-01-23 09:21:26.862 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:26 compute-0 nova_compute[182092]: 2026-01-23 09:21:26.971 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:27 compute-0 nova_compute[182092]: 2026-01-23 09:21:27.009 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:27 compute-0 ovn_controller[94697]: 2026-01-23T09:21:27Z|00186|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:21:27 compute-0 nova_compute[182092]: 2026-01-23 09:21:27.669 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:31 compute-0 nova_compute[182092]: 2026-01-23 09:21:31.864 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:31 compute-0 nova_compute[182092]: 2026-01-23 09:21:31.974 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.191 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.191 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.191 182096 DEBUG nova.objects.instance [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.378 182096 DEBUG nova.compute.manager [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-changed-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.378 182096 DEBUG nova.compute.manager [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing instance network info cache due to event network-changed-7c108f42-128c-4c60-86d0-3b9925670206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.379 182096 DEBUG oslo_concurrency.lockutils [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.379 182096 DEBUG oslo_concurrency.lockutils [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:34 compute-0 nova_compute[182092]: 2026-01-23 09:21:34.379 182096 DEBUG nova.network.neutron [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:21:35 compute-0 podman[216520]: 2026-01-23 09:21:35.207321168 +0000 UTC m=+0.044741251 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:21:35 compute-0 nova_compute[182092]: 2026-01-23 09:21:35.219 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:35 compute-0 podman[216521]: 2026-01-23 09:21:35.228215287 +0000 UTC m=+0.064443892 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:21:35 compute-0 nova_compute[182092]: 2026-01-23 09:21:35.557 182096 DEBUG nova.objects.instance [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:35 compute-0 nova_compute[182092]: 2026-01-23 09:21:35.578 182096 DEBUG nova.network.neutron [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.552 182096 DEBUG nova.policy [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '654d6343796442d7946c6adfe1179a1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.656 182096 DEBUG nova.network.neutron [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updated VIF entry in instance network info cache for port 7c108f42-128c-4c60-86d0-3b9925670206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.657 182096 DEBUG nova.network.neutron [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.680 182096 DEBUG oslo_concurrency.lockutils [req-810404c9-2e3e-413d-9391-427778b5b9a0 req-d6039fb8-c6bf-473a-8494-8d01bcee264b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:36 compute-0 nova_compute[182092]: 2026-01-23 09:21:36.975 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:37 compute-0 podman[216558]: 2026-01-23 09:21:37.201770228 +0000 UTC m=+0.040496062 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.674 182096 DEBUG nova.network.neutron [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Successfully updated port: 4f2fd023-0fe8-4946-bd4d-158c350e1470 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.688 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.689 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.689 182096 DEBUG nova.network.neutron [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.795 182096 DEBUG nova.compute.manager [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-changed-4f2fd023-0fe8-4946-bd4d-158c350e1470 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.795 182096 DEBUG nova.compute.manager [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing instance network info cache due to event network-changed-4f2fd023-0fe8-4946-bd4d-158c350e1470. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.795 182096 DEBUG oslo_concurrency.lockutils [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:37 compute-0 nova_compute[182092]: 2026-01-23 09:21:37.951 182096 WARNING nova.network.neutron [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] 502ff19d-7b13-4dc2-8ece-02806b418ba0 already exists in list: networks containing: ['502ff19d-7b13-4dc2-8ece-02806b418ba0']. ignoring it
Jan 23 09:21:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:39.856 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:39.857 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:39.857 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:40 compute-0 nova_compute[182092]: 2026-01-23 09:21:40.111 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.768 182096 DEBUG nova.network.neutron [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.789 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.790 182096 DEBUG oslo_concurrency.lockutils [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.790 182096 DEBUG nova.network.neutron [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Refreshing network info cache for port 4f2fd023-0fe8-4946-bd4d-158c350e1470 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.792 182096 DEBUG nova.virt.libvirt.vif [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.793 182096 DEBUG nova.network.os_vif_util [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.793 182096 DEBUG nova.network.os_vif_util [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.794 182096 DEBUG os_vif [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.794 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.795 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.795 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.798 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f2fd023-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.798 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f2fd023-0f, col_values=(('external_ids', {'iface-id': '4f2fd023-0fe8-4946-bd4d-158c350e1470', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:78:44', 'vm-uuid': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.799 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 NetworkManager[54920]: <info>  [1769160101.8002] manager: (tap4f2fd023-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.801 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.805 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.806 182096 INFO os_vif [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f')
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.807 182096 DEBUG nova.virt.libvirt.vif [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.807 182096 DEBUG nova.network.os_vif_util [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.807 182096 DEBUG nova.network.os_vif_util [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.808 182096 DEBUG nova.virt.libvirt.guest [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] attach device xml: <interface type="ethernet">
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:eb:78:44"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <target dev="tap4f2fd023-0f"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]: </interface>
Jan 23 09:21:41 compute-0 nova_compute[182092]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 09:21:41 compute-0 kernel: tap4f2fd023-0f: entered promiscuous mode
Jan 23 09:21:41 compute-0 NetworkManager[54920]: <info>  [1769160101.8168] manager: (tap4f2fd023-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 23 09:21:41 compute-0 ovn_controller[94697]: 2026-01-23T09:21:41Z|00187|binding|INFO|Claiming lport 4f2fd023-0fe8-4946-bd4d-158c350e1470 for this chassis.
Jan 23 09:21:41 compute-0 ovn_controller[94697]: 2026-01-23T09:21:41Z|00188|binding|INFO|4f2fd023-0fe8-4946-bd4d-158c350e1470: Claiming fa:16:3e:eb:78:44 10.100.0.10
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.822 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 ovn_controller[94697]: 2026-01-23T09:21:41Z|00189|binding|INFO|Setting lport 4f2fd023-0fe8-4946-bd4d-158c350e1470 up in Southbound
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.833 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:78:44 10.100.0.10'], port_security=['fa:16:3e:eb:78:44 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2077805407', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2077805407', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '7', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4f2fd023-0fe8-4946-bd4d-158c350e1470) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.834 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4f2fd023-0fe8-4946-bd4d-158c350e1470 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 bound to our chassis
Jan 23 09:21:41 compute-0 ovn_controller[94697]: 2026-01-23T09:21:41Z|00190|binding|INFO|Setting lport 4f2fd023-0fe8-4946-bd4d-158c350e1470 ovn-installed in OVS
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.836 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.835 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.839 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 systemd-udevd[216583]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.848 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[48e74d11-e23b-4a23-9730-ab934645b60a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 NetworkManager[54920]: <info>  [1769160101.8559] device (tap4f2fd023-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:21:41 compute-0 NetworkManager[54920]: <info>  [1769160101.8564] device (tap4f2fd023-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.873 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7095b975-7800-49a5-8699-1315efade4b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.875 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc8a005-5c5f-499f-8102-d35a9bb4c987]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.886 182096 DEBUG nova.virt.libvirt.driver [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.886 182096 DEBUG nova.virt.libvirt.driver [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.887 182096 DEBUG nova.virt.libvirt.driver [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:7a:c2:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.887 182096 DEBUG nova.virt.libvirt.driver [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] No VIF found with MAC fa:16:3e:eb:78:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.898 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d515e329-c105-4d88-bc16-3f61e66a9119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.902 182096 DEBUG nova.virt.libvirt.guest [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:name>tempest-tempest.common.compute-instance-782237197</nova:name>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:21:41</nova:creationTime>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:port uuid="7c108f42-128c-4c60-86d0-3b9925670206">
Jan 23 09:21:41 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     <nova:port uuid="4f2fd023-0fe8-4946-bd4d-158c350e1470">
Jan 23 09:21:41 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:21:41 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:41 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:21:41 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:21:41 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.910 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[33caa03d-0198-456e-ad1e-f482e5b56749]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357314, 'reachable_time': 31993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216590, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.921 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f27841d4-3b21-46fa-ace4-6a06cbb86138]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357321, 'tstamp': 357321}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216591, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357323, 'tstamp': 357323}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216591, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.922 182096 DEBUG oslo_concurrency.lockutils [None req-b695c469-1e59-4f5e-bc1f-bfbd9469b548 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.923 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.924 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.925 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.925 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.925 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:41.925 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:41 compute-0 nova_compute[182092]: 2026-01-23 09:21:41.976 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.058 182096 DEBUG nova.compute.manager [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.059 182096 DEBUG oslo_concurrency.lockutils [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.059 182096 DEBUG oslo_concurrency.lockutils [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.060 182096 DEBUG oslo_concurrency.lockutils [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.060 182096 DEBUG nova.compute.manager [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:42 compute-0 nova_compute[182092]: 2026-01-23 09:21:42.060 182096 WARNING nova.compute.manager [req-8b739fa6-2477-448e-bb36-5586873b43c0 req-94034ea5-f495-4280-8a4a-d582a5aac3a9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received unexpected event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 for instance with vm_state active and task_state None.
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.417 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.417 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.434 182096 DEBUG nova.objects.instance [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'flavor' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.452 182096 DEBUG nova.virt.libvirt.vif [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.452 182096 DEBUG nova.network.os_vif_util [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.453 182096 DEBUG nova.network.os_vif_util [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.455 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.457 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.458 182096 DEBUG nova.virt.libvirt.driver [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Attempting to detach device tap4f2fd023-0f from instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.459 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:eb:78:44"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <target dev="tap4f2fd023-0f"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </interface>
Jan 23 09:21:43 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.463 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.466 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface>not found in domain: <domain type='kvm' id='28'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <name>instance-00000040</name>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <uuid>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</uuid>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:name>tempest-tempest.common.compute-instance-782237197</nova:name>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:21:41</nova:creationTime>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:port uuid="7c108f42-128c-4c60-86d0-3b9925670206">
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:port uuid="4f2fd023-0fe8-4946-bd4d-158c350e1470">
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <system>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='serial'>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='uuid'>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </system>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <os>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </os>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <features>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </features>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk' index='2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config' index='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:7a:c2:8b'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='tap7c108f42-12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:eb:78:44'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='tap4f2fd023-0f'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='net1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log' append='off'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       </target>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log' append='off'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </console>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <video>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </video>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c174,c971</label>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c174,c971</imagelabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </domain>
Jan 23 09:21:43 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.468 182096 INFO nova.virt.libvirt.driver [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully detached device tap4f2fd023-0f from instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 from the persistent domain config.
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.468 182096 DEBUG nova.virt.libvirt.driver [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] (1/8): Attempting to detach device tap4f2fd023-0f with device alias net1 from instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.469 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:eb:78:44"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <target dev="tap4f2fd023-0f"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </interface>
Jan 23 09:21:43 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:21:43 compute-0 kernel: tap4f2fd023-0f (unregistering): left promiscuous mode
Jan 23 09:21:43 compute-0 NetworkManager[54920]: <info>  [1769160103.5114] device (tap4f2fd023-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.516 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:43 compute-0 ovn_controller[94697]: 2026-01-23T09:21:43Z|00191|binding|INFO|Releasing lport 4f2fd023-0fe8-4946-bd4d-158c350e1470 from this chassis (sb_readonly=0)
Jan 23 09:21:43 compute-0 ovn_controller[94697]: 2026-01-23T09:21:43Z|00192|binding|INFO|Setting lport 4f2fd023-0fe8-4946-bd4d-158c350e1470 down in Southbound
Jan 23 09:21:43 compute-0 ovn_controller[94697]: 2026-01-23T09:21:43Z|00193|binding|INFO|Removing iface tap4f2fd023-0f ovn-installed in OVS
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.519 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.521 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Received event <DeviceRemovedEvent: 1769160103.520131, 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.522 182096 DEBUG nova.virt.libvirt.driver [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Start waiting for the detach event from libvirt for device tap4f2fd023-0f with device alias net1 for instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.522 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.522 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:78:44 10.100.0.10'], port_security=['fa:16:3e:eb:78:44 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2077805407', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2077805407', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '9', 'neutron:security_group_ids': '2e1e3e7d-99c7-4c17-b0f8-7664ef7a84a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4f2fd023-0fe8-4946-bd4d-158c350e1470) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.523 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4f2fd023-0fe8-4946-bd4d-158c350e1470 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.525 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:eb:78:44"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4f2fd023-0f"/></interface>not found in domain: <domain type='kvm' id='28'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <name>instance-00000040</name>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <uuid>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</uuid>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:name>tempest-tempest.common.compute-instance-782237197</nova:name>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:21:41</nova:creationTime>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:port uuid="7c108f42-128c-4c60-86d0-3b9925670206">
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:port uuid="4f2fd023-0fe8-4946-bd4d-158c350e1470">
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <system>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='serial'>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='uuid'>2fbdbef7-69ac-429d-9d7c-2aa24650cdf3</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </system>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <os>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </os>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <features>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </features>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk' index='2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/disk.config' index='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:7a:c2:8b'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target dev='tap7c108f42-12'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log' append='off'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       </target>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3/console.log' append='off'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </console>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <video>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </video>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c174,c971</label>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c174,c971</imagelabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </domain>
Jan 23 09:21:43 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.525 182096 INFO nova.virt.libvirt.driver [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully detached device tap4f2fd023-0f from instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 from the live domain config.
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.525 182096 DEBUG nova.virt.libvirt.vif [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.526 182096 DEBUG nova.network.os_vif_util [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.526 182096 DEBUG nova.network.os_vif_util [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.526 182096 DEBUG os_vif [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.528 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.528 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f2fd023-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.531 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 502ff19d-7b13-4dc2-8ece-02806b418ba0
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.533 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.534 182096 INFO os_vif [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f')
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.535 182096 DEBUG nova.virt.libvirt.guest [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:name>tempest-tempest.common.compute-instance-782237197</nova:name>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:21:43</nova:creationTime>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:user uuid="654d6343796442d7946c6adfe1179a1f">tempest-AttachInterfacesTestJSON-1921741714-project-member</nova:user>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:project uuid="c2a29923657746f0a0e0d489a2b1a730">tempest-AttachInterfacesTestJSON-1921741714</nova:project>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     <nova:port uuid="7c108f42-128c-4c60-86d0-3b9925670206">
Jan 23 09:21:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:21:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:21:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:21:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:21:43 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.541 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[00cd0028-605e-4ec7-b0ea-41bb66efe3f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.561 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1da633fc-f727-45a6-9beb-976cbdbdb867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.563 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[18b443a5-1aa1-4cd7-a49b-73e5f066cfdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.582 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f34e7e04-810a-4f3e-872e-ee7563ff2afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.594 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[31ff71a8-be90-457b-bcf1-8bd0874c82fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap502ff19d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:ce:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357314, 'reachable_time': 31993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216601, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.605 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee68411-f772-4147-a74f-d65b9cce887b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357321, 'tstamp': 357321}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216602, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap502ff19d-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357323, 'tstamp': 357323}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216602, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.606 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.607 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:43 compute-0 nova_compute[182092]: 2026-01-23 09:21:43.608 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.610 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap502ff19d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.610 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.610 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap502ff19d-70, col_values=(('external_ids', {'iface-id': '21509454-c1b4-453c-b803-0f28e59a6f24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:43.611 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:44 compute-0 nova_compute[182092]: 2026-01-23 09:21:44.357 182096 DEBUG nova.network.neutron [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updated VIF entry in instance network info cache for port 4f2fd023-0fe8-4946-bd4d-158c350e1470. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:21:44 compute-0 nova_compute[182092]: 2026-01-23 09:21:44.357 182096 DEBUG nova.network.neutron [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.015 182096 DEBUG nova.compute.manager [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.016 182096 DEBUG oslo_concurrency.lockutils [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.016 182096 DEBUG oslo_concurrency.lockutils [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.016 182096 DEBUG oslo_concurrency.lockutils [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.016 182096 DEBUG nova.compute.manager [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.017 182096 WARNING nova.compute.manager [req-fac53786-c9df-4281-ab33-5c6b3a7e88f9 req-50e9c65b-9739-4eb8-8ac3-97120f3c4a0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received unexpected event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 for instance with vm_state active and task_state None.
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.201 182096 DEBUG oslo_concurrency.lockutils [req-8778fff9-a1c0-4801-b2cd-eeb7c6372375 req-a07b088c-b4cf-4603-a151-e8d1734dea7b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:46 compute-0 podman[216603]: 2026-01-23 09:21:46.220418757 +0000 UTC m=+0.057817470 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.461 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.461 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquired lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.461 182096 DEBUG nova.network.neutron [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:21:46 compute-0 ovn_controller[94697]: 2026-01-23T09:21:46Z|00194|binding|INFO|Releasing lport 21509454-c1b4-453c-b803-0f28e59a6f24 from this chassis (sb_readonly=0)
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.908 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.930 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.930 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.930 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.930 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.931 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.936 182096 INFO nova.compute.manager [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Terminating instance
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.944 182096 DEBUG nova.compute.manager [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:21:46 compute-0 kernel: tap7c108f42-12 (unregistering): left promiscuous mode
Jan 23 09:21:46 compute-0 NetworkManager[54920]: <info>  [1769160106.9686] device (tap7c108f42-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:21:46 compute-0 ovn_controller[94697]: 2026-01-23T09:21:46Z|00195|binding|INFO|Releasing lport 7c108f42-128c-4c60-86d0-3b9925670206 from this chassis (sb_readonly=0)
Jan 23 09:21:46 compute-0 ovn_controller[94697]: 2026-01-23T09:21:46Z|00196|binding|INFO|Setting lport 7c108f42-128c-4c60-86d0-3b9925670206 down in Southbound
Jan 23 09:21:46 compute-0 ovn_controller[94697]: 2026-01-23T09:21:46Z|00197|binding|INFO|Removing iface tap7c108f42-12 ovn-installed in OVS
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.974 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.976 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:46.981 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:c2:8b 10.100.0.12'], port_security=['fa:16:3e:7a:c2:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2fbdbef7-69ac-429d-9d7c-2aa24650cdf3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2a29923657746f0a0e0d489a2b1a730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f3fb4cf-2840-4de0-97eb-8fffad852b11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b884c6a0-77e7-4548-adf1-cecbda671e9b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=7c108f42-128c-4c60-86d0-3b9925670206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:21:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:46.982 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 7c108f42-128c-4c60-86d0-3b9925670206 in datapath 502ff19d-7b13-4dc2-8ece-02806b418ba0 unbound from our chassis
Jan 23 09:21:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:46.984 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 502ff19d-7b13-4dc2-8ece-02806b418ba0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:21:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:46.985 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1619a3-e20a-423d-9857-ca09e290567a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:46.986 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 namespace which is not needed anymore
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.991 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:46 compute-0 nova_compute[182092]: 2026-01-23 09:21:46.993 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 23 09:21:47 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000040.scope: Consumed 11.585s CPU time.
Jan 23 09:21:47 compute-0 systemd-machined[153562]: Machine qemu-28-instance-00000040 terminated.
Jan 23 09:21:47 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [NOTICE]   (216436) : haproxy version is 2.8.14-c23fe91
Jan 23 09:21:47 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [NOTICE]   (216436) : path to executable is /usr/sbin/haproxy
Jan 23 09:21:47 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [WARNING]  (216436) : Exiting Master process...
Jan 23 09:21:47 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [ALERT]    (216436) : Current worker (216438) exited with code 143 (Terminated)
Jan 23 09:21:47 compute-0 neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0[216432]: [WARNING]  (216436) : All workers exited. Exiting... (0)
Jan 23 09:21:47 compute-0 systemd[1]: libpod-ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575.scope: Deactivated successfully.
Jan 23 09:21:47 compute-0 podman[216647]: 2026-01-23 09:21:47.085176949 +0000 UTC m=+0.033448546 container died ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:21:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575-userdata-shm.mount: Deactivated successfully.
Jan 23 09:21:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-71e7e25259d786d533040dd75b3cb0a96c25fd49187cab7e7ebcd567d8d63ee1-merged.mount: Deactivated successfully.
Jan 23 09:21:47 compute-0 podman[216647]: 2026-01-23 09:21:47.112308175 +0000 UTC m=+0.060579772 container cleanup ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:21:47 compute-0 systemd[1]: libpod-conmon-ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575.scope: Deactivated successfully.
Jan 23 09:21:47 compute-0 podman[216671]: 2026-01-23 09:21:47.154115739 +0000 UTC m=+0.024661238 container remove ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:21:47 compute-0 NetworkManager[54920]: <info>  [1769160107.1580] manager: (tap7c108f42-12): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.160 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.158 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4af278-10ab-4351-b6de-64e6698f0c59]: (4, ('Fri Jan 23 09:21:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 (ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575)\nab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575\nFri Jan 23 09:21:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 (ab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575)\nab030d638479e18824765af34a0a13603afa0feb6d5c69636ca2b640a6b59575\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.162 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b619a06-accd-4805-aca0-ddd110f68169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.164 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap502ff19d-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.165 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 kernel: tap502ff19d-70: left promiscuous mode
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.182 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.184 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c646c0-2733-4bee-95c4-a6f7421eef0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.189 182096 INFO nova.virt.libvirt.driver [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Instance destroyed successfully.
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.190 182096 DEBUG nova.objects.instance [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lazy-loading 'resources' on Instance uuid 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.189 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[105ce9c5-61e9-4344-af63-d564d39627e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.190 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[276d2283-c871-4a90-aa28-77467216046d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.204 182096 DEBUG nova.virt.libvirt.vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.204 182096 DEBUG nova.network.os_vif_util [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6edbe594-ea2b-4880-8d38-6b73e23e841f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357308, 'reachable_time': 26025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216694, 'error': None, 'target': 'ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.205 182096 DEBUG nova.network.os_vif_util [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.205 182096 DEBUG os_vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:21:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d502ff19d\x2d7b13\x2d4dc2\x2d8ece\x2d02806b418ba0.mount: Deactivated successfully.
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.207 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.207 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c108f42-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.207 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-502ff19d-7b13-4dc2-8ece-02806b418ba0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:21:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:21:47.207 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8b08bd-79bf-4c12-9029-6a459095c583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.209 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.211 182096 INFO os_vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:c2:8b,bridge_name='br-int',has_traffic_filtering=True,id=7c108f42-128c-4c60-86d0-3b9925670206,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c108f42-12')
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.211 182096 DEBUG nova.virt.libvirt.vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:21:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-782237197',display_name='tempest-tempest.common.compute-instance-782237197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-782237197',id=64,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDg8ahhHaQkg+6lm53SYjvMozWMMN2AcJr5YfQfUI1hgnJLtemARhOfCUa25epRzPLvxhSZO4sYNIBGK0jSQNUEbVpqpbF1sr79CtjLFRispul5jio1OhsZycmL1OUJNvA==',key_name='tempest-keypair-1189478671',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:21:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2a29923657746f0a0e0d489a2b1a730',ramdisk_id='',reservation_id='r-objhyfjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-1921741714',owner_user_name='tempest-AttachInterfacesTestJSON-1921741714-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='654d6343796442d7946c6adfe1179a1f',uuid=2fbdbef7-69ac-429d-9d7c-2aa24650cdf3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.212 182096 DEBUG nova.network.os_vif_util [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converting VIF {"id": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "address": "fa:16:3e:eb:78:44", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f2fd023-0f", "ovs_interfaceid": "4f2fd023-0fe8-4946-bd4d-158c350e1470", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.212 182096 DEBUG nova.network.os_vif_util [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.212 182096 DEBUG os_vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.213 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.213 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f2fd023-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.213 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.214 182096 INFO os_vif [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:78:44,bridge_name='br-int',has_traffic_filtering=True,id=4f2fd023-0fe8-4946-bd4d-158c350e1470,network=Network(502ff19d-7b13-4dc2-8ece-02806b418ba0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4f2fd023-0f')
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.215 182096 INFO nova.virt.libvirt.driver [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Deleting instance files /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3_del
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.215 182096 INFO nova.virt.libvirt.driver [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Deletion of /var/lib/nova/instances/2fbdbef7-69ac-429d-9d7c-2aa24650cdf3_del complete
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.272 182096 DEBUG nova.compute.manager [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-unplugged-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.272 182096 DEBUG oslo_concurrency.lockutils [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.272 182096 DEBUG oslo_concurrency.lockutils [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.272 182096 DEBUG oslo_concurrency.lockutils [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.272 182096 DEBUG nova.compute.manager [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-unplugged-7c108f42-128c-4c60-86d0-3b9925670206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.273 182096 DEBUG nova.compute.manager [req-ef7220ce-7fa2-4371-a4e9-b95f3b9e16c8 req-5ade5c28-82ae-4f81-9658-816d9fd0182c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-unplugged-7c108f42-128c-4c60-86d0-3b9925670206 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.282 182096 INFO nova.compute.manager [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.282 182096 DEBUG oslo.service.loopingcall [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.282 182096 DEBUG nova.compute.manager [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.282 182096 DEBUG nova.network.neutron [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.903 182096 INFO nova.network.neutron [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Port 4f2fd023-0fe8-4946-bd4d-158c350e1470 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.904 182096 DEBUG nova.network.neutron [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [{"id": "7c108f42-128c-4c60-86d0-3b9925670206", "address": "fa:16:3e:7a:c2:8b", "network": {"id": "502ff19d-7b13-4dc2-8ece-02806b418ba0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1996409307-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2a29923657746f0a0e0d489a2b1a730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c108f42-12", "ovs_interfaceid": "7c108f42-128c-4c60-86d0-3b9925670206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.925 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Releasing lock "refresh_cache-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:21:47 compute-0 nova_compute[182092]: 2026-01-23 09:21:47.954 182096 DEBUG oslo_concurrency.lockutils [None req-e46adc1b-d7e5-4900-ab75-220f9a021084 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "interface-2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-4f2fd023-0fe8-4946-bd4d-158c350e1470" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.343 182096 DEBUG nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-unplugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-unplugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-unplugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.344 182096 DEBUG nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.345 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.345 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.345 182096 DEBUG oslo_concurrency.lockutils [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.345 182096 DEBUG nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.345 182096 WARNING nova.compute.manager [req-e8ac0532-c976-4714-8762-f168e352a400 req-713818cc-d8b0-4375-b0eb-791228c3707c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received unexpected event network-vif-plugged-4f2fd023-0fe8-4946-bd4d-158c350e1470 for instance with vm_state active and task_state deleting.
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.398 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.398 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.415 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.489 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.489 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.494 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.494 182096 INFO nova.compute.claims [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.646 182096 DEBUG nova.compute.provider_tree [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.659 182096 DEBUG nova.scheduler.client.report [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.678 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.678 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.716 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.716 182096 DEBUG nova.network.neutron [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.726 182096 INFO nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.737 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.804 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.805 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.805 182096 INFO nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Creating image(s)
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.805 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.806 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.806 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.834 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.896 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.896 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.897 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.906 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.952 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.953 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.968 182096 DEBUG nova.network.neutron [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.968 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.976 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.977 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:48 compute-0 nova_compute[182092]: 2026-01-23 09:21:48.977 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.035 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.036 182096 DEBUG nova.virt.disk.api [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Checking if we can resize image /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.036 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.087 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.087 182096 DEBUG nova.virt.disk.api [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Cannot resize image /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.088 182096 DEBUG nova.objects.instance [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 88007f09-2a63-400d-b5d6-8cc5e235cfec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.113 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.113 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Ensure instance console log exists: /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.114 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.114 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.114 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.116 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.120 182096 WARNING nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.127 182096 DEBUG nova.virt.libvirt.host [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.128 182096 DEBUG nova.virt.libvirt.host [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.131 182096 DEBUG nova.virt.libvirt.host [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.131 182096 DEBUG nova.virt.libvirt.host [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.132 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.132 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.133 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.134 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.134 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.134 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.134 182096 DEBUG nova.virt.hardware [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.137 182096 DEBUG nova.objects.instance [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88007f09-2a63-400d-b5d6-8cc5e235cfec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.155 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <uuid>88007f09-2a63-400d-b5d6-8cc5e235cfec</uuid>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <name>instance-00000043</name>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:name>tempest-ListImageFiltersTestJSON-server-315426533</nova:name>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:21:49</nova:creationTime>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:user uuid="3d72b076966a4293938c0de4745cb90e">tempest-ListImageFiltersTestJSON-229235271-project-member</nova:user>
Jan 23 09:21:49 compute-0 nova_compute[182092]:         <nova:project uuid="264309705d864400acd32d564396e5c0">tempest-ListImageFiltersTestJSON-229235271</nova:project>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <system>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="serial">88007f09-2a63-400d-b5d6-8cc5e235cfec</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="uuid">88007f09-2a63-400d-b5d6-8cc5e235cfec</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </system>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <os>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </os>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <features>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </features>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.config"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/console.log" append="off"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <video>
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </video>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:21:49 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:21:49 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:21:49 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:21:49 compute-0 nova_compute[182092]: </domain>
Jan 23 09:21:49 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.193 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.193 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.194 182096 INFO nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Using config drive
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.338 182096 INFO nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Creating config drive at /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.config
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.342 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5z4y04al execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.467 182096 DEBUG oslo_concurrency.processutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5z4y04al" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.483 182096 DEBUG nova.compute.manager [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.483 182096 DEBUG oslo_concurrency.lockutils [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.484 182096 DEBUG oslo_concurrency.lockutils [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.484 182096 DEBUG oslo_concurrency.lockutils [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.484 182096 DEBUG nova.compute.manager [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] No waiting events found dispatching network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.484 182096 WARNING nova.compute.manager [req-3da7cb8e-5dfc-4279-beaa-ccc478b15dc3 req-25a0271f-130f-4a33-bce2-ba57a6507d2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received unexpected event network-vif-plugged-7c108f42-128c-4c60-86d0-3b9925670206 for instance with vm_state active and task_state deleting.
Jan 23 09:21:49 compute-0 systemd-machined[153562]: New machine qemu-29-instance-00000043.
Jan 23 09:21:49 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-00000043.
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.628 182096 DEBUG nova.network.neutron [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.641 182096 INFO nova.compute.manager [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Took 2.36 seconds to deallocate network for instance.
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.697 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.698 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.767 182096 DEBUG nova.compute.provider_tree [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.782 182096 DEBUG nova.scheduler.client.report [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.800 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.831 182096 INFO nova.scheduler.client.report [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Deleted allocations for instance 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3
Jan 23 09:21:49 compute-0 nova_compute[182092]: 2026-01-23 09:21:49.896 182096 DEBUG oslo_concurrency.lockutils [None req-88321b4f-d5a7-4a77-8a90-10d04fcafe68 654d6343796442d7946c6adfe1179a1f c2a29923657746f0a0e0d489a2b1a730 - - default default] Lock "2fbdbef7-69ac-429d-9d7c-2aa24650cdf3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.013 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160110.013037, 88007f09-2a63-400d-b5d6-8cc5e235cfec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.013 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] VM Resumed (Lifecycle Event)
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.015 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.016 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.018 182096 INFO nova.virt.libvirt.driver [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance spawned successfully.
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.018 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.038 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.040 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.040 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.041 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.041 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.041 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.042 182096 DEBUG nova.virt.libvirt.driver [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.044 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.073 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.073 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160110.0150542, 88007f09-2a63-400d-b5d6-8cc5e235cfec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.073 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] VM Started (Lifecycle Event)
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.102 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.104 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.121 182096 INFO nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Took 1.32 seconds to spawn the instance on the hypervisor.
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.122 182096 DEBUG nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.142 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.188 182096 INFO nova.compute.manager [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Took 1.73 seconds to build instance.
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.199 182096 DEBUG oslo_concurrency.lockutils [None req-84aa54a3-9c99-44db-a973-4f54f1aaf87e 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.345 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:50 compute-0 nova_compute[182092]: 2026-01-23 09:21:50.509 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:51 compute-0 nova_compute[182092]: 2026-01-23 09:21:51.666 182096 DEBUG nova.compute.manager [req-4c2f5817-f5ca-444d-81dd-6b9d2de65b5f req-4d6350cd-ee22-4593-9a88-04fed29ccbd8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Received event network-vif-deleted-7c108f42-128c-4c60-86d0-3b9925670206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:21:51 compute-0 nova_compute[182092]: 2026-01-23 09:21:51.993 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:52 compute-0 nova_compute[182092]: 2026-01-23 09:21:52.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:54 compute-0 podman[216740]: 2026-01-23 09:21:54.23319678 +0000 UTC m=+0.068727904 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:21:54 compute-0 podman[216739]: 2026-01-23 09:21:54.237881889 +0000 UTC m=+0.074202232 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:21:56 compute-0 nova_compute[182092]: 2026-01-23 09:21:56.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:56 compute-0 nova_compute[182092]: 2026-01-23 09:21:56.994 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:57 compute-0 nova_compute[182092]: 2026-01-23 09:21:57.209 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:21:57 compute-0 nova_compute[182092]: 2026-01-23 09:21:57.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.438 182096 DEBUG nova.compute.manager [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.480 182096 INFO nova.compute.manager [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] instance snapshotting
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:21:58 compute-0 nova_compute[182092]: 2026-01-23 09:21:58.856 182096 INFO nova.virt.libvirt.driver [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Beginning live snapshot process
Jan 23 09:21:59 compute-0 virtqemud[181713]: invalid argument: disk vda does not have an active block job
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.054 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.101 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json -f qcow2" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.102 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.156 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.167 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.224 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.225 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.248 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456.delta 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.250 182096 INFO nova.virt.libvirt.driver [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.287 182096 DEBUG nova.virt.libvirt.guest [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.290 182096 INFO nova.virt.libvirt.driver [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.314 182096 DEBUG nova.privsep.utils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.314 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456.delta /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.364 182096 DEBUG oslo_concurrency.processutils [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456.delta /var/lib/nova/instances/snapshots/tmp065h8mm7/1709aeb9f8574af6bd760508dd8f3456" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.364 182096 INFO nova.virt.libvirt.driver [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Snapshot extracted, beginning image upload
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.691 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.691 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.691 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.691 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.735 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.797 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.798 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:21:59 compute-0 nova_compute[182092]: 2026-01-23 09:21:59.843 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.050 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.052 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5603MB free_disk=73.27423095703125GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.052 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.052 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.125 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 88007f09-2a63-400d-b5d6-8cc5e235cfec actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.125 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.125 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.167 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.183 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.279 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:22:00 compute-0 nova_compute[182092]: 2026-01-23 09:22:00.279 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.264 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.264 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.265 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.414 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.414 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.414 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.415 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88007f09-2a63-400d-b5d6-8cc5e235cfec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.581 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.678 182096 INFO nova.virt.libvirt.driver [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Snapshot image upload complete
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.678 182096 INFO nova.compute.manager [None req-e5fe7054-e957-4995-a979-37dc2410aef9 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Took 3.19 seconds to snapshot the instance on the hypervisor.
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.853 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.869 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.869 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.869 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:01 compute-0 nova_compute[182092]: 2026-01-23 09:22:01.997 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:02 compute-0 nova_compute[182092]: 2026-01-23 09:22:02.187 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160107.1865106, 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:02 compute-0 nova_compute[182092]: 2026-01-23 09:22:02.187 182096 INFO nova.compute.manager [-] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] VM Stopped (Lifecycle Event)
Jan 23 09:22:02 compute-0 nova_compute[182092]: 2026-01-23 09:22:02.207 182096 DEBUG nova.compute.manager [None req-d7e6bf0a-15e3-4b01-8851-8803093f9b1f - - - - - -] [instance: 2fbdbef7-69ac-429d-9d7c-2aa24650cdf3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:02 compute-0 nova_compute[182092]: 2026-01-23 09:22:02.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.169 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.169 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.209 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.374 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.374 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.378 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.379 182096 INFO nova.compute.claims [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.686 182096 DEBUG nova.compute.provider_tree [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.830 182096 DEBUG nova.scheduler.client.report [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.876 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:05 compute-0 nova_compute[182092]: 2026-01-23 09:22:05.877 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.095 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.095 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.147 182096 INFO nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.178 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:22:06 compute-0 podman[216820]: 2026-01-23 09:22:06.20686465 +0000 UTC m=+0.037964986 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:22:06 compute-0 podman[216819]: 2026-01-23 09:22:06.207041363 +0000 UTC m=+0.040469542 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.311 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.311 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.312 182096 INFO nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Creating image(s)
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.312 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.312 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.313 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.323 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.370 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.371 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.372 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.381 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.427 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.428 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.450 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.451 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.452 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.498 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.499 182096 DEBUG nova.virt.disk.api [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Checking if we can resize image /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.500 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.547 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.548 182096 DEBUG nova.virt.disk.api [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Cannot resize image /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.548 182096 DEBUG nova.objects.instance [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'migration_context' on Instance uuid 39d03a92-ae38-47c1-aa4a-8041f0b84b15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.641 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.641 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Ensure instance console log exists: /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.641 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.642 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.642 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:06 compute-0 nova_compute[182092]: 2026-01-23 09:22:06.999 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:07 compute-0 nova_compute[182092]: 2026-01-23 09:22:07.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:07 compute-0 nova_compute[182092]: 2026-01-23 09:22:07.379 182096 DEBUG nova.policy [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '98de0e9eb03748a688af5ed612c515ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:22:08 compute-0 podman[216871]: 2026-01-23 09:22:08.20134109 +0000 UTC m=+0.038700934 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides 
the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public)
Jan 23 09:22:08 compute-0 nova_compute[182092]: 2026-01-23 09:22:08.865 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Successfully created port: 481d5a7e-9b12-4201-ad8a-d88913d78f20 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:22:12 compute-0 nova_compute[182092]: 2026-01-23 09:22:12.000 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:12 compute-0 nova_compute[182092]: 2026-01-23 09:22:12.212 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.460 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Successfully updated port: 481d5a7e-9b12-4201-ad8a-d88913d78f20 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.481 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.481 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquired lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.481 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.795 182096 DEBUG nova.compute.manager [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-changed-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.796 182096 DEBUG nova.compute.manager [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Refreshing instance network info cache due to event network-changed-481d5a7e-9b12-4201-ad8a-d88913d78f20. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.796 182096 DEBUG oslo_concurrency.lockutils [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:13 compute-0 nova_compute[182092]: 2026-01-23 09:22:13.947 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.148 182096 DEBUG nova.network.neutron [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Updating instance_info_cache with network_info: [{"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.178 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Releasing lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.178 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Instance network_info: |[{"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.179 182096 DEBUG oslo_concurrency.lockutils [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.179 182096 DEBUG nova.network.neutron [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Refreshing network info cache for port 481d5a7e-9b12-4201-ad8a-d88913d78f20 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.181 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Start _get_guest_xml network_info=[{"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.184 182096 WARNING nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.190 182096 DEBUG nova.virt.libvirt.host [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.190 182096 DEBUG nova.virt.libvirt.host [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.195 182096 DEBUG nova.virt.libvirt.host [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.196 182096 DEBUG nova.virt.libvirt.host [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.197 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.197 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.197 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.198 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.198 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.198 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.198 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.198 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.199 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.199 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.199 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.199 182096 DEBUG nova.virt.hardware [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.202 182096 DEBUG nova.virt.libvirt.vif [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:22:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2002074765',display_name='tempest-tempest.common.compute-instance-2002074765-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2002074765-2',id=69,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-k2h5jsjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTestJSON-1292493636-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:22:06Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=39d03a92-ae38-47c1-aa4a-8041f0b84b15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.202 182096 DEBUG nova.network.os_vif_util [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.202 182096 DEBUG nova.network.os_vif_util [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.203 182096 DEBUG nova.objects.instance [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'pci_devices' on Instance uuid 39d03a92-ae38-47c1-aa4a-8041f0b84b15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.227 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <uuid>39d03a92-ae38-47c1-aa4a-8041f0b84b15</uuid>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <name>instance-00000045</name>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:name>tempest-tempest.common.compute-instance-2002074765-2</nova:name>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:22:16</nova:creationTime>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:user uuid="98de0e9eb03748a688af5ed612c515ce">tempest-MultipleCreateTestJSON-1292493636-project-member</nova:user>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:project uuid="a7ee14425b6747efb2e72842d0c056ad">tempest-MultipleCreateTestJSON-1292493636</nova:project>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         <nova:port uuid="481d5a7e-9b12-4201-ad8a-d88913d78f20">
Jan 23 09:22:16 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <system>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="serial">39d03a92-ae38-47c1-aa4a-8041f0b84b15</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="uuid">39d03a92-ae38-47c1-aa4a-8041f0b84b15</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </system>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <os>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </os>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <features>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </features>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.config"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:01:1b:04"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <target dev="tap481d5a7e-9b"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/console.log" append="off"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <video>
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </video>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:22:16 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:22:16 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:22:16 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:22:16 compute-0 nova_compute[182092]: </domain>
Jan 23 09:22:16 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.228 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Preparing to wait for external event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.229 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.229 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.229 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.229 182096 DEBUG nova.virt.libvirt.vif [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:22:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2002074765',display_name='tempest-tempest.common.compute-instance-2002074765-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2002074765-2',id=69,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-k2h5jsjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTestJSON-1292493636-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:22:06Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=39d03a92-ae38-47c1-aa4a-8041f0b84b15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.230 182096 DEBUG nova.network.os_vif_util [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.230 182096 DEBUG nova.network.os_vif_util [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.230 182096 DEBUG os_vif [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.231 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.231 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.233 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.233 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap481d5a7e-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.233 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap481d5a7e-9b, col_values=(('external_ids', {'iface-id': '481d5a7e-9b12-4201-ad8a-d88913d78f20', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:1b:04', 'vm-uuid': '39d03a92-ae38-47c1-aa4a-8041f0b84b15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.234 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:16 compute-0 NetworkManager[54920]: <info>  [1769160136.2352] manager: (tap481d5a7e-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.240 182096 INFO os_vif [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b')
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.276 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.276 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.277 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No VIF found with MAC fa:16:3e:01:1b:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:22:16 compute-0 nova_compute[182092]: 2026-01-23 09:22:16.277 182096 INFO nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Using config drive
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.000 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 podman[216891]: 2026-01-23 09:22:17.219253371 +0000 UTC m=+0.058132684 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.251 182096 INFO nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Creating config drive at /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.config
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.255 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp21xc_q9y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.374 182096 DEBUG oslo_concurrency.processutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp21xc_q9y" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:17 compute-0 kernel: tap481d5a7e-9b: entered promiscuous mode
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.4164] manager: (tap481d5a7e-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 23 09:22:17 compute-0 ovn_controller[94697]: 2026-01-23T09:22:17Z|00198|binding|INFO|Claiming lport 481d5a7e-9b12-4201-ad8a-d88913d78f20 for this chassis.
Jan 23 09:22:17 compute-0 ovn_controller[94697]: 2026-01-23T09:22:17Z|00199|binding|INFO|481d5a7e-9b12-4201-ad8a-d88913d78f20: Claiming fa:16:3e:01:1b:04 10.100.0.10
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.418 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.424 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.431 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:1b:04 10.100.0.10'], port_security=['fa:16:3e:01:1b:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '39d03a92-ae38-47c1-aa4a-8041f0b84b15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dae6fd97-3d1f-488f-995e-721a38374a58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91792383-edee-41db-a8be-9e88e85faeab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3159c470-c9ca-403b-a7f0-e2ef4dc1d89a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=481d5a7e-9b12-4201-ad8a-d88913d78f20) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.432 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 481d5a7e-9b12-4201-ad8a-d88913d78f20 in datapath dae6fd97-3d1f-488f-995e-721a38374a58 bound to our chassis
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.433 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:17 compute-0 systemd-udevd[216931]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.443 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f263d3ba-13ac-4fd4-8e95-bd3f80087d41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.443 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdae6fd97-31 in ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.445 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdae6fd97-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.445 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3887b0-067d-4361-8910-87ecce921b7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.447 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[62870062-bc57-4ab1-b4d5-238d474fda01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.4538] device (tap481d5a7e-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.4564] device (tap481d5a7e-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.459 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[73fa3d9f-e314-4e91-b39f-0dcd65a2093b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 systemd-machined[153562]: New machine qemu-30-instance-00000045.
Jan 23 09:22:17 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-00000045.
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.479 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 ovn_controller[94697]: 2026-01-23T09:22:17Z|00200|binding|INFO|Setting lport 481d5a7e-9b12-4201-ad8a-d88913d78f20 ovn-installed in OVS
Jan 23 09:22:17 compute-0 ovn_controller[94697]: 2026-01-23T09:22:17Z|00201|binding|INFO|Setting lport 481d5a7e-9b12-4201-ad8a-d88913d78f20 up in Southbound
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.484 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.488 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[32ee01dc-23b2-44fe-909a-988965e67e21]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.508 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4680bc06-b08f-411e-a765-a5144b1fda6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.511 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[508d13cd-a4c2-419f-bc6e-a704831c6527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.5122] manager: (tapdae6fd97-30): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.534 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9206cc11-e236-48ac-a862-86783008d828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.536 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a6092711-dae0-443d-b586-29198c0f45f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.5566] device (tapdae6fd97-30): carrier: link connected
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.560 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[44d31202-ff03-4627-b43a-9386684a2509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.573 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c60da6ee-605a-442d-842b-6de38984d663]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdae6fd97-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:10:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364304, 'reachable_time': 37679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216957, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.584 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[807ffbc5-8c2c-40c1-b586-d42fc8a1dfc8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:1035'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364304, 'tstamp': 364304}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216958, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.599 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[31be45f9-cee7-417f-9b6a-075ab0b2da81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdae6fd97-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:10:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364304, 'reachable_time': 37679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216959, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[76232975-c020-4c47-92d5-b2e137736918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.657 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[96c24f3f-480e-4cc9-b43d-d6e7be904bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.658 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdae6fd97-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.659 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.659 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdae6fd97-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.660 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 NetworkManager[54920]: <info>  [1769160137.6613] manager: (tapdae6fd97-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 23 09:22:17 compute-0 kernel: tapdae6fd97-30: entered promiscuous mode
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.663 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdae6fd97-30, col_values=(('external_ids', {'iface-id': 'fa4a8929-9892-4386-8add-a0f479d3fa44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.663 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 ovn_controller[94697]: 2026-01-23T09:22:17Z|00202|binding|INFO|Releasing lport fa4a8929-9892-4386-8add-a0f479d3fa44 from this chassis (sb_readonly=0)
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.675 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.676 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.676 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[968b87a4-8fa7-4899-afa2-5a80d472b6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.677 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:22:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:17.678 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'env', 'PROCESS_TAG=haproxy-dae6fd97-3d1f-488f-995e-721a38374a58', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dae6fd97-3d1f-488f-995e-721a38374a58.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:22:17 compute-0 podman[216993]: 2026-01-23 09:22:17.953176679 +0000 UTC m=+0.032740358 container create 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.964 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160137.9636078, 39d03a92-ae38-47c1-aa4a-8041f0b84b15 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:17 compute-0 nova_compute[182092]: 2026-01-23 09:22:17.964 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] VM Started (Lifecycle Event)
Jan 23 09:22:17 compute-0 systemd[1]: Started libpod-conmon-39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be.scope.
Jan 23 09:22:18 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.009 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.012 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160137.9637613, 39d03a92-ae38-47c1-aa4a-8041f0b84b15 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.013 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] VM Paused (Lifecycle Event)
Jan 23 09:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/475ae066e61fbccd74c747ce753a1570257f5c10d5ba2be90e662d7c066ac1f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:22:18 compute-0 podman[216993]: 2026-01-23 09:22:18.021261869 +0000 UTC m=+0.100825559 container init 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:22:18 compute-0 podman[216993]: 2026-01-23 09:22:18.025893537 +0000 UTC m=+0.105457217 container start 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:22:18 compute-0 podman[216993]: 2026-01-23 09:22:17.937061516 +0000 UTC m=+0.016625216 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:22:18 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [NOTICE]   (217010) : New worker (217012) forked
Jan 23 09:22:18 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [NOTICE]   (217010) : Loading success.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.043 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.046 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.059 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.256 182096 DEBUG nova.compute.manager [req-83adbf31-1044-4954-8781-803c8ebe11ab req-52111f52-4cf0-4d93-8917-a10b1a86b73b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.256 182096 DEBUG oslo_concurrency.lockutils [req-83adbf31-1044-4954-8781-803c8ebe11ab req-52111f52-4cf0-4d93-8917-a10b1a86b73b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.256 182096 DEBUG oslo_concurrency.lockutils [req-83adbf31-1044-4954-8781-803c8ebe11ab req-52111f52-4cf0-4d93-8917-a10b1a86b73b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.256 182096 DEBUG oslo_concurrency.lockutils [req-83adbf31-1044-4954-8781-803c8ebe11ab req-52111f52-4cf0-4d93-8917-a10b1a86b73b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.257 182096 DEBUG nova.compute.manager [req-83adbf31-1044-4954-8781-803c8ebe11ab req-52111f52-4cf0-4d93-8917-a10b1a86b73b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Processing event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.257 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.259 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160138.25932, 39d03a92-ae38-47c1-aa4a-8041f0b84b15 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.259 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] VM Resumed (Lifecycle Event)
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.261 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.263 182096 INFO nova.virt.libvirt.driver [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Instance spawned successfully.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.263 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.288 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.288 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.289 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.289 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.289 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.290 182096 DEBUG nova.virt.libvirt.driver [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.293 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.294 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.329 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.371 182096 INFO nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Took 12.06 seconds to spawn the instance on the hypervisor.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.371 182096 DEBUG nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.468 182096 INFO nova.compute.manager [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Took 13.14 seconds to build instance.
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.497 182096 DEBUG oslo_concurrency.lockutils [None req-cd3e0907-5790-412e-ab5c-30649ef10ca5 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.941 182096 DEBUG nova.network.neutron [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Updated VIF entry in instance network info cache for port 481d5a7e-9b12-4201-ad8a-d88913d78f20. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.941 182096 DEBUG nova.network.neutron [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Updating instance_info_cache with network_info: [{"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:18 compute-0 nova_compute[182092]: 2026-01-23 09:22:18.952 182096 DEBUG oslo_concurrency.lockutils [req-e134e6bc-76e6-4507-aa5c-0b8923e58ee3 req-09dfdeaf-8f02-44b1-aa8b-9fc86dba9a77 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-39d03a92-ae38-47c1-aa4a-8041f0b84b15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.599 182096 DEBUG nova.compute.manager [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.599 182096 DEBUG oslo_concurrency.lockutils [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.600 182096 DEBUG oslo_concurrency.lockutils [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.600 182096 DEBUG oslo_concurrency.lockutils [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.600 182096 DEBUG nova.compute.manager [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] No waiting events found dispatching network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:20 compute-0 nova_compute[182092]: 2026-01-23 09:22:20.600 182096 WARNING nova.compute.manager [req-bc939111-d0f6-4fa6-b1fd-f14e9b2627b8 req-f5c06f9d-2fdf-4b9d-a27b-8782515a3de7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received unexpected event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 for instance with vm_state active and task_state None.
Jan 23 09:22:21 compute-0 nova_compute[182092]: 2026-01-23 09:22:21.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.003 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.597 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.597 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.598 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.598 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.598 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.605 182096 INFO nova.compute.manager [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Terminating instance
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.612 182096 DEBUG nova.compute.manager [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:22:22 compute-0 kernel: tap481d5a7e-9b (unregistering): left promiscuous mode
Jan 23 09:22:22 compute-0 NetworkManager[54920]: <info>  [1769160142.6292] device (tap481d5a7e-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:22:22 compute-0 ovn_controller[94697]: 2026-01-23T09:22:22Z|00203|binding|INFO|Releasing lport 481d5a7e-9b12-4201-ad8a-d88913d78f20 from this chassis (sb_readonly=0)
Jan 23 09:22:22 compute-0 ovn_controller[94697]: 2026-01-23T09:22:22Z|00204|binding|INFO|Setting lport 481d5a7e-9b12-4201-ad8a-d88913d78f20 down in Southbound
Jan 23 09:22:22 compute-0 ovn_controller[94697]: 2026-01-23T09:22:22Z|00205|binding|INFO|Removing iface tap481d5a7e-9b ovn-installed in OVS
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.634 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.635 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.638 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:1b:04 10.100.0.10'], port_security=['fa:16:3e:01:1b:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '39d03a92-ae38-47c1-aa4a-8041f0b84b15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dae6fd97-3d1f-488f-995e-721a38374a58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '91792383-edee-41db-a8be-9e88e85faeab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3159c470-c9ca-403b-a7f0-e2ef4dc1d89a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=481d5a7e-9b12-4201-ad8a-d88913d78f20) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.639 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 481d5a7e-9b12-4201-ad8a-d88913d78f20 in datapath dae6fd97-3d1f-488f-995e-721a38374a58 unbound from our chassis
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.641 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dae6fd97-3d1f-488f-995e-721a38374a58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.644 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2683d632-f6c8-403b-be09-f42218b3b044]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.645 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 namespace which is not needed anymore
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.651 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 23 09:22:22 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000045.scope: Consumed 4.838s CPU time.
Jan 23 09:22:22 compute-0 systemd-machined[153562]: Machine qemu-30-instance-00000045 terminated.
Jan 23 09:22:22 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [NOTICE]   (217010) : haproxy version is 2.8.14-c23fe91
Jan 23 09:22:22 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [NOTICE]   (217010) : path to executable is /usr/sbin/haproxy
Jan 23 09:22:22 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [WARNING]  (217010) : Exiting Master process...
Jan 23 09:22:22 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [ALERT]    (217010) : Current worker (217012) exited with code 143 (Terminated)
Jan 23 09:22:22 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217006]: [WARNING]  (217010) : All workers exited. Exiting... (0)
Jan 23 09:22:22 compute-0 systemd[1]: libpod-39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be.scope: Deactivated successfully.
Jan 23 09:22:22 compute-0 conmon[217006]: conmon 39272a39d644b4c61ba2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be.scope/container/memory.events
Jan 23 09:22:22 compute-0 podman[217038]: 2026-01-23 09:22:22.747912579 +0000 UTC m=+0.035297784 container died 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 09:22:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-475ae066e61fbccd74c747ce753a1570257f5c10d5ba2be90e662d7c066ac1f7-merged.mount: Deactivated successfully.
Jan 23 09:22:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be-userdata-shm.mount: Deactivated successfully.
Jan 23 09:22:22 compute-0 podman[217038]: 2026-01-23 09:22:22.768944527 +0000 UTC m=+0.056329733 container cleanup 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:22:22 compute-0 systemd[1]: libpod-conmon-39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be.scope: Deactivated successfully.
Jan 23 09:22:22 compute-0 podman[217064]: 2026-01-23 09:22:22.807127744 +0000 UTC m=+0.023826623 container remove 39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.810 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ecadc99a-c494-4603-9ade-bdcd7eeb3ccc]: (4, ('Fri Jan 23 09:22:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 (39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be)\n39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be\nFri Jan 23 09:22:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 (39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be)\n39272a39d644b4c61ba25cd4dea459940b05b3812712c7e2b52882665f1995be\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.811 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc773101-7739-4357-bebc-a71d38e1de1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.812 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdae6fd97-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.813 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 kernel: tapdae6fd97-30: left promiscuous mode
Jan 23 09:22:22 compute-0 NetworkManager[54920]: <info>  [1769160142.8293] manager: (tap481d5a7e-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.831 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.833 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[785c8f41-c5bf-47ea-b87d-f79933c45b1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.841 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3231128-2ddc-439e-a20e-4ce9a369d5d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.842 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6c42b6-8c13-414c-b390-37cb301e4d5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.854 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cb535ba5-9825-4144-b066-18612a5902f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364299, 'reachable_time': 37326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217088, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 systemd[1]: run-netns-ovnmeta\x2ddae6fd97\x2d3d1f\x2d488f\x2d995e\x2d721a38374a58.mount: Deactivated successfully.
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.856 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:22:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:22.856 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[10cebcd4-1555-4b96-ac6d-f0875f517f19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.860 182096 INFO nova.virt.libvirt.driver [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Instance destroyed successfully.
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.860 182096 DEBUG nova.objects.instance [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'resources' on Instance uuid 39d03a92-ae38-47c1-aa4a-8041f0b84b15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.896 182096 DEBUG nova.virt.libvirt.vif [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:22:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2002074765',display_name='tempest-tempest.common.compute-instance-2002074765-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2002074765-2',id=69,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T09:22:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-k2h5jsjr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTestJSON-1292493636-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:22:18Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=39d03a92-ae38-47c1-aa4a-8041f0b84b15,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.896 182096 DEBUG nova.network.os_vif_util [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "address": "fa:16:3e:01:1b:04", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap481d5a7e-9b", "ovs_interfaceid": "481d5a7e-9b12-4201-ad8a-d88913d78f20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.897 182096 DEBUG nova.network.os_vif_util [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.897 182096 DEBUG os_vif [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.899 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.899 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap481d5a7e-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.903 182096 INFO os_vif [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:1b:04,bridge_name='br-int',has_traffic_filtering=True,id=481d5a7e-9b12-4201-ad8a-d88913d78f20,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap481d5a7e-9b')
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.903 182096 INFO nova.virt.libvirt.driver [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Deleting instance files /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15_del
Jan 23 09:22:22 compute-0 nova_compute[182092]: 2026-01-23 09:22:22.904 182096 INFO nova.virt.libvirt.driver [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Deletion of /var/lib/nova/instances/39d03a92-ae38-47c1-aa4a-8041f0b84b15_del complete
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.043 182096 DEBUG nova.compute.manager [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-unplugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.043 182096 DEBUG oslo_concurrency.lockutils [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.043 182096 DEBUG oslo_concurrency.lockutils [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.043 182096 DEBUG oslo_concurrency.lockutils [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.044 182096 DEBUG nova.compute.manager [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] No waiting events found dispatching network-vif-unplugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.044 182096 DEBUG nova.compute.manager [req-8fe179cf-5a39-43c1-8d9a-01be38f88c2b req-16bb0f84-002c-4903-ab2e-a2e62440239e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-unplugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.087 182096 INFO nova.compute.manager [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.087 182096 DEBUG oslo.service.loopingcall [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.088 182096 DEBUG nova.compute.manager [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:22:23 compute-0 nova_compute[182092]: 2026-01-23 09:22:23.088 182096 DEBUG nova.network.neutron [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.258 182096 DEBUG nova.network.neutron [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.274 182096 INFO nova.compute.manager [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Took 1.19 seconds to deallocate network for instance.
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.363 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.363 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.364 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.364 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.364 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "88007f09-2a63-400d-b5d6-8cc5e235cfec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.364 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.365 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.371 182096 INFO nova.compute.manager [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Terminating instance
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.378 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.378 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquired lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.378 182096 DEBUG nova.network.neutron [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.446 182096 DEBUG nova.compute.provider_tree [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.466 182096 DEBUG nova.scheduler.client.report [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.533 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.575 182096 INFO nova.scheduler.client.report [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Deleted allocations for instance 39d03a92-ae38-47c1-aa4a-8041f0b84b15
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.620 182096 DEBUG nova.compute.manager [req-f44df43e-3913-4666-8808-0399724a5024 req-edd31744-2e8b-4280-b2da-c9cc722520ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-deleted-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.678 182096 DEBUG oslo_concurrency.lockutils [None req-5648ea1f-2742-4dc5-8292-e575da142337 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:24 compute-0 nova_compute[182092]: 2026-01-23 09:22:24.707 182096 DEBUG nova.network.neutron [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.189 182096 DEBUG nova.compute.manager [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.190 182096 DEBUG oslo_concurrency.lockutils [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.190 182096 DEBUG oslo_concurrency.lockutils [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.190 182096 DEBUG oslo_concurrency.lockutils [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "39d03a92-ae38-47c1-aa4a-8041f0b84b15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.190 182096 DEBUG nova.compute.manager [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] No waiting events found dispatching network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.190 182096 WARNING nova.compute.manager [req-78bf4fd7-d71a-4f5c-a0b4-59973a78c780 req-697944eb-54d8-469b-8feb-4e5b61d4b680 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Received unexpected event network-vif-plugged-481d5a7e-9b12-4201-ad8a-d88913d78f20 for instance with vm_state deleted and task_state None.
Jan 23 09:22:25 compute-0 podman[217095]: 2026-01-23 09:22:25.200555561 +0000 UTC m=+0.033573702 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:22:25 compute-0 podman[217094]: 2026-01-23 09:22:25.209378095 +0000 UTC m=+0.047317501 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.653 182096 DEBUG nova.network.neutron [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.675 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Releasing lock "refresh_cache-88007f09-2a63-400d-b5d6-8cc5e235cfec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.675 182096 DEBUG nova.compute.manager [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:22:25 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000043.scope: Deactivated successfully.
Jan 23 09:22:25 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000043.scope: Consumed 11.239s CPU time.
Jan 23 09:22:25 compute-0 systemd-machined[153562]: Machine qemu-29-instance-00000043 terminated.
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.905 182096 INFO nova.virt.libvirt.driver [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance destroyed successfully.
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.905 182096 DEBUG nova.objects.instance [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lazy-loading 'resources' on Instance uuid 88007f09-2a63-400d-b5d6-8cc5e235cfec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.920 182096 INFO nova.virt.libvirt.driver [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Deleting instance files /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec_del
Jan 23 09:22:25 compute-0 nova_compute[182092]: 2026-01-23 09:22:25.921 182096 INFO nova.virt.libvirt.driver [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Deletion of /var/lib/nova/instances/88007f09-2a63-400d-b5d6-8cc5e235cfec_del complete
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.013 182096 INFO nova.compute.manager [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.013 182096 DEBUG oslo.service.loopingcall [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.013 182096 DEBUG nova.compute.manager [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.013 182096 DEBUG nova.network.neutron [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.903 182096 DEBUG nova.network.neutron [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.928 182096 DEBUG nova.network.neutron [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:26 compute-0 nova_compute[182092]: 2026-01-23 09:22:26.962 182096 INFO nova.compute.manager [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Took 0.95 seconds to deallocate network for instance.
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.004 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.048 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.049 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.096 182096 DEBUG nova.compute.provider_tree [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.118 182096 DEBUG nova.scheduler.client.report [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.147 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.189 182096 INFO nova.scheduler.client.report [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Deleted allocations for instance 88007f09-2a63-400d-b5d6-8cc5e235cfec
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.266 182096 DEBUG oslo_concurrency.lockutils [None req-184b1d47-04d7-4314-b203-cb49c4814438 3d72b076966a4293938c0de4745cb90e 264309705d864400acd32d564396e5c0 - - default default] Lock "88007f09-2a63-400d-b5d6-8cc5e235cfec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:27.568 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:22:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:27.569 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.570 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:27 compute-0 nova_compute[182092]: 2026-01-23 09:22:27.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.292 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "3565c36e-66c4-408c-9b5d-34356739efc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.293 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.314 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.422 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.422 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.428 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.428 182096 INFO nova.compute.claims [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.559 182096 DEBUG nova.compute.provider_tree [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.579 182096 DEBUG nova.scheduler.client.report [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.600 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.601 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.658 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.673 182096 INFO nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.678 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.679 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.699 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.702 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.829 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.830 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.834 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.835 182096 INFO nova.compute.claims [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.892 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.893 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.894 182096 INFO nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating image(s)
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.894 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.894 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.895 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.905 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.953 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.953 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.954 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:29 compute-0 nova_compute[182092]: 2026-01-23 09:22:29.964 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.010 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.011 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.034 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.035 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.035 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.066 182096 DEBUG nova.compute.provider_tree [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.081 182096 DEBUG nova.scheduler.client.report [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.085 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.086 182096 DEBUG nova.virt.disk.api [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Checking if we can resize image /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.086 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.101 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.102 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.134 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.134 182096 DEBUG nova.virt.disk.api [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Cannot resize image /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.134 182096 DEBUG nova.objects.instance [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'migration_context' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.151 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.152 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Ensure instance console log exists: /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.152 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.152 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.153 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.154 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.157 182096 WARNING nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.161 182096 DEBUG nova.virt.libvirt.host [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.162 182096 DEBUG nova.virt.libvirt.host [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.171 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.171 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.174 182096 DEBUG nova.virt.libvirt.host [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.174 182096 DEBUG nova.virt.libvirt.host [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.175 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.175 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.176 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.176 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.176 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.176 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.177 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.177 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.177 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.177 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.177 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.178 182096 DEBUG nova.virt.hardware [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.180 182096 DEBUG nova.objects.instance [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'pci_devices' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.190 182096 INFO nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.198 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <uuid>3565c36e-66c4-408c-9b5d-34356739efc5</uuid>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <name>instance-00000048</name>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV247Test-server-2102390312</nova:name>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:22:30</nova:creationTime>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:user uuid="06fe70d62c9946f9b10d7d260a8d24aa">tempest-ServerShowV247Test-2078701084-project-member</nova:user>
Jan 23 09:22:30 compute-0 nova_compute[182092]:         <nova:project uuid="a6a3a3aa1b474ddd8cf8d23d2ea93efa">tempest-ServerShowV247Test-2078701084</nova:project>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <system>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="serial">3565c36e-66c4-408c-9b5d-34356739efc5</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="uuid">3565c36e-66c4-408c-9b5d-34356739efc5</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </system>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <os>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </os>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <features>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </features>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/console.log" append="off"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <video>
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </video>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:22:30 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:22:30 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:22:30 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:22:30 compute-0 nova_compute[182092]: </domain>
Jan 23 09:22:30 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.208 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.252 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.252 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.253 182096 INFO nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Using config drive
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.331 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.332 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.333 182096 INFO nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Creating image(s)
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.333 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.334 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.334 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.345 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.392 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.393 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.394 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.403 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.449 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.450 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.473 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.474 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.475 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.522 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.523 182096 DEBUG nova.virt.disk.api [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Checking if we can resize image /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.523 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.570 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.571 182096 DEBUG nova.virt.disk.api [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Cannot resize image /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.571 182096 DEBUG nova.objects.instance [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'migration_context' on Instance uuid 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.585 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.586 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Ensure instance console log exists: /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.586 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.586 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.587 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.909 182096 INFO nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating config drive at /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.914 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhllc_z0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:30 compute-0 nova_compute[182092]: 2026-01-23 09:22:30.992 182096 DEBUG nova.policy [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '98de0e9eb03748a688af5ed612c515ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.032 182096 DEBUG oslo_concurrency.processutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhllc_z0" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:31 compute-0 systemd-machined[153562]: New machine qemu-31-instance-00000048.
Jan 23 09:22:31 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-00000048.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.250 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160151.2497597, 3565c36e-66c4-408c-9b5d-34356739efc5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.250 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] VM Resumed (Lifecycle Event)
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.253 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.253 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.256 182096 INFO nova.virt.libvirt.driver [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance spawned successfully.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.256 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.275 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.275 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.275 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.276 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.276 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.277 182096 DEBUG nova.virt.libvirt.driver [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.280 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.282 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.302 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.303 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160151.2539835, 3565c36e-66c4-408c-9b5d-34356739efc5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.303 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] VM Started (Lifecycle Event)
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.327 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.329 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.344 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.355 182096 INFO nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Took 1.46 seconds to spawn the instance on the hypervisor.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.356 182096 DEBUG nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.417 182096 INFO nova.compute.manager [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Took 2.01 seconds to build instance.
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.433 182096 DEBUG oslo_concurrency.lockutils [None req-b3193712-0f69-42bc-a1a2-c381f7d398a9 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:31 compute-0 nova_compute[182092]: 2026-01-23 09:22:31.695 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Successfully created port: 245227da-dfb0-477a-8626-df9b5425ae01 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.006 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.662 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Successfully updated port: 245227da-dfb0-477a-8626-df9b5425ae01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.680 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.680 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquired lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.681 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:22:32 compute-0 nova_compute[182092]: 2026-01-23 09:22:32.902 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:32.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'name': 'tempest-ServerShowV247Test-server-2102390312', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'hostId': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:32.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.001 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.001 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>]
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:22:33 compute-0 nova_compute[182092]: 2026-01-23 09:22:33.006 182096 DEBUG nova.compute.manager [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-changed-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:33 compute-0 nova_compute[182092]: 2026-01-23 09:22:33.006 182096 DEBUG nova.compute.manager [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Refreshing instance network info cache due to event network-changed-245227da-dfb0-477a-8626-df9b5425ae01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:22:33 compute-0 nova_compute[182092]: 2026-01-23 09:22:33.006 182096 DEBUG oslo_concurrency.lockutils [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.020 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.020 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b43d51e3-478b-43fd-9323-f00aad15b7ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.001977', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c387726-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': 'efc915d1204bc64bb576ba0067e1a56cfc0f13397d029b1dffb411ea75f23c57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.001977', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c388180-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '4b0f069df6dda93ec684360a46790abb6b4cb7a6e7aaca15ce349a494d1fb288'}]}, 'timestamp': '2026-01-23 09:22:33.020971', '_unique_id': 'f398258d55624046ae2344f2d097d079'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.030 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.030 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e0cec8a-3148-4b45-ac73-2a169781aef9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.022448', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c39f98e-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': '5b018b53e2f06cc374cc6ec5759e641151257119162e78c451c5f48c335759cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.022448', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3a02d0-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': '297873e526e8f2e6fe5acdbcf15290464d93109678a3dcedc0449ef633731b09'}]}, 'timestamp': '2026-01-23 09:22:33.030824', '_unique_id': 'dc17c6c919fd4e9d8f233ca053b96464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>]
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>]
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.032 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.043 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/cpu volume: 1730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0056a39-a0ad-41dd-814c-800e97a2c5bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1730000000, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'timestamp': '2026-01-23T09:22:33.032597', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0c3bfedc-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.57291164, 'message_signature': 'f0b60c7cfa69b927274a806e8d278e434d104465413e2162179f8cc5c4432521'}]}, 'timestamp': '2026-01-23 09:22:33.043838', '_unique_id': '3c6cb849c75743ac850d6a1353648581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec5c6069-8bc8-4f07-9390-4ae3004b51b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.045015', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3c3550-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '4ffbff8017a037a68464708734b486fb49156aa6da7fa49d55eb51e3a7cb35a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.045015', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3c3d34-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '5f89c55ebad1b575bf3caac749d2ab8a26ec56c5ef272d10023df3421855c2c0'}]}, 'timestamp': '2026-01-23 09:22:33.045434', '_unique_id': '137f46b9180047a594a5207466eb9880'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.045 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.046 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.046 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3400d2d6-a5c5-4c09-aaf6-b4bf3a32f95f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.046682', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3c76dc-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': '5f1a87f38989f5c7d670c3340f6859a515b9aafdcfb6c186c4fb49c2e388e56e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.046682', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3c7eb6-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': 'a85dbfb1681675ec89e6a615fbcfc639f4d42af0a95c125f122b59f49f950aa9'}]}, 'timestamp': '2026-01-23 09:22:33.047097', '_unique_id': '6d012fde4cbb413d85f2118623b73839'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.047 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.048 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.048 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.048 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35e52737-d863-41c2-8e68-617f716cad2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.048155', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3cb020-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': 'eb341c566c7434fe1c4a5804d3f05d6f7512e91dda6c02338d39d6fdba8f9405'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 
'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.048155', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3cb872-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': 'bbf0758cabe483d8b69e80b59cafc496372d202583e20c7252e66a2f82c3784c'}]}, 'timestamp': '2026-01-23 09:22:33.048575', '_unique_id': 'd76fb2b7cd89425bb307fb7923d2a3da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3565c36e-66c4-408c-9b5d-34356739efc5: ceilometer.compute.pollsters.NoVolumeException
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.049 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e2bcb61-4d80-4a64-b75e-761f5b3340de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.049941', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3cf5ee-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': 'c3a3b4c0861b01f05a23b69c3f9726e63dd42024e430b5c03b68b29f67da208f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': 
'3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.049941', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3cfdc8-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.551830387, 'message_signature': '9a49754ded2b10c5a312a7196fbe7d7559cd7580d098d34a6cc7af5e80664381'}]}, 'timestamp': '2026-01-23 09:22:33.050348', '_unique_id': '7036aa056d294d1f91cc2171713d0c66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.051 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.051 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73fafc12-64d4-47b9-bc83-6278b13f83c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.051419', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3d2fbe-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '728c4ac499429c9dbfbc5fb037ba93d44780e767b51dcab9829882af5a666fb3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 
'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.051419', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3d3860-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '2511446cb7faa58efe3f9ea79c06a9e0f5d1f6bb96eec3b0abaf0bdccb362d28'}]}, 'timestamp': '2026-01-23 09:22:33.051852', '_unique_id': 'b8da96aad1ef4a2e8ddd65408372a885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.latency volume: 175438394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.read.latency volume: 3311748 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da20e864-b4a5-4c95-9de4-d5048a30320c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 175438394, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.053053', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3d6f56-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '658b8053f7de46e23fa7be7e923f99b6e9acb328845c1bde7e013594ebf0989c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3311748, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 
'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.053053', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3d771c-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': 'e413f5af355e0be77fa6d86ac9b5d41ade7efe285cc3af53e0c236596adf0ce2'}]}, 'timestamp': '2026-01-23 09:22:33.053463', '_unique_id': 'feff8271fa9a456282ce42d499ae040c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.054 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.054 12 DEBUG ceilometer.compute.pollsters [-] 3565c36e-66c4-408c-9b5d-34356739efc5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30dce3a0-f96d-4abf-a265-ab109266898e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-vda', 'timestamp': '2026-01-23T09:22:33.054621', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0c3dad90-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': 'af6b9714defdc481e0e95e4e7d784927d29e2c6fcade1e3678c4335c66a6010f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '06fe70d62c9946f9b10d7d260a8d24aa', 'user_name': None, 'project_id': 'a6a3a3aa1b474ddd8cf8d23d2ea93efa', 'project_name': None, 
'resource_id': '3565c36e-66c4-408c-9b5d-34356739efc5-sda', 'timestamp': '2026-01-23T09:22:33.054621', 'resource_metadata': {'display_name': 'tempest-ServerShowV247Test-server-2102390312', 'name': 'instance-00000048', 'instance_id': '3565c36e-66c4-408c-9b5d-34356739efc5', 'instance_type': 'm1.nano', 'host': '80929ae39b23f398222d5f8fa754f9ac6f7ab73c750af8a5a86f0292', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0c3db56a-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3658.531358876, 'message_signature': '1077ccb3bbe0f0e14b96cbb6d064cfe38c3778d64c0ae371d85a3e787c56b913'}]}, 'timestamp': '2026-01-23 09:22:33.055051', '_unique_id': '059f76ff25174d958f5cf3e1105c525a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-2102390312>]
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:22:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:22:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:22:33 compute-0 nova_compute[182092]: 2026-01-23 09:22:33.460 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:34 compute-0 nova_compute[182092]: 2026-01-23 09:22:34.805 182096 INFO nova.compute.manager [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Rebuilding instance
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.010 182096 DEBUG nova.network.neutron [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Updating instance_info_cache with network_info: [{"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.036 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Releasing lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.037 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Instance network_info: |[{"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.037 182096 DEBUG oslo_concurrency.lockutils [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.037 182096 DEBUG nova.network.neutron [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Refreshing network info cache for port 245227da-dfb0-477a-8626-df9b5425ae01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.040 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Start _get_guest_xml network_info=[{"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.043 182096 WARNING nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.047 182096 DEBUG nova.virt.libvirt.host [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.048 182096 DEBUG nova.virt.libvirt.host [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.053 182096 DEBUG nova.virt.libvirt.host [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.053 182096 DEBUG nova.virt.libvirt.host [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.054 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.054 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.055 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.055 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.055 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.055 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.056 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.058 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.058 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.058 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.060 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.060 182096 DEBUG nova.virt.hardware [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.063 182096 DEBUG nova.virt.libvirt.vif [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:22:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-379842775',display_name='tempest-MultipleCreateTestJSON-server-379842775-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-379842775-1',id=73,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-b6v7lkbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTes
tJSON-1292493636-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:22:30Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.063 182096 DEBUG nova.network.os_vif_util [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.064 182096 DEBUG nova.network.os_vif_util [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.065 182096 DEBUG nova.objects.instance [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.077 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <uuid>1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214</uuid>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <name>instance-00000049</name>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:name>tempest-MultipleCreateTestJSON-server-379842775-1</nova:name>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:22:35</nova:creationTime>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:user uuid="98de0e9eb03748a688af5ed612c515ce">tempest-MultipleCreateTestJSON-1292493636-project-member</nova:user>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:project uuid="a7ee14425b6747efb2e72842d0c056ad">tempest-MultipleCreateTestJSON-1292493636</nova:project>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         <nova:port uuid="245227da-dfb0-477a-8626-df9b5425ae01">
Jan 23 09:22:35 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <system>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="serial">1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="uuid">1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </system>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <os>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </os>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <features>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </features>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.config"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:a2:c9:19"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <target dev="tap245227da-df"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/console.log" append="off"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <video>
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </video>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:22:35 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:22:35 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:22:35 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:22:35 compute-0 nova_compute[182092]: </domain>
Jan 23 09:22:35 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.081 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Preparing to wait for external event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.081 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.081 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.082 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.082 182096 DEBUG nova.virt.libvirt.vif [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:22:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-379842775',display_name='tempest-MultipleCreateTestJSON-server-379842775-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-379842775-1',id=73,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-b6v7lkbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTestJSON-1292493636-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:22:30Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.083 182096 DEBUG nova.network.os_vif_util [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.083 182096 DEBUG nova.network.os_vif_util [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.083 182096 DEBUG os_vif [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.084 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.084 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.085 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.087 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.087 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap245227da-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.087 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap245227da-df, col_values=(('external_ids', {'iface-id': '245227da-dfb0-477a-8626-df9b5425ae01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:c9:19', 'vm-uuid': '1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:35 compute-0 NetworkManager[54920]: <info>  [1769160155.0895] manager: (tap245227da-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.088 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.096 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.097 182096 INFO os_vif [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df')
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.138 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.138 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.138 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] No VIF found with MAC fa:16:3e:a2:c9:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.139 182096 INFO nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Using config drive
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.445 182096 DEBUG nova.compute.manager [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.527 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'pci_requests' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.540 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'pci_devices' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.552 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'resources' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.566 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'migration_context' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.570 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.615 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.617 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.808 182096 INFO nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Creating config drive at /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.config
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.813 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9o6wk8us execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.930 182096 DEBUG oslo_concurrency.processutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9o6wk8us" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:35 compute-0 kernel: tap245227da-df: entered promiscuous mode
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.969 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 NetworkManager[54920]: <info>  [1769160155.9711] manager: (tap245227da-df): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 23 09:22:35 compute-0 ovn_controller[94697]: 2026-01-23T09:22:35Z|00206|binding|INFO|Claiming lport 245227da-dfb0-477a-8626-df9b5425ae01 for this chassis.
Jan 23 09:22:35 compute-0 ovn_controller[94697]: 2026-01-23T09:22:35Z|00207|binding|INFO|245227da-dfb0-477a-8626-df9b5425ae01: Claiming fa:16:3e:a2:c9:19 10.100.0.9
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.981 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:c9:19 10.100.0.9'], port_security=['fa:16:3e:a2:c9:19 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dae6fd97-3d1f-488f-995e-721a38374a58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91792383-edee-41db-a8be-9e88e85faeab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3159c470-c9ca-403b-a7f0-e2ef4dc1d89a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=245227da-dfb0-477a-8626-df9b5425ae01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.983 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 245227da-dfb0-477a-8626-df9b5425ae01 in datapath dae6fd97-3d1f-488f-995e-721a38374a58 bound to our chassis
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.984 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:35 compute-0 ovn_controller[94697]: 2026-01-23T09:22:35Z|00208|binding|INFO|Setting lport 245227da-dfb0-477a-8626-df9b5425ae01 ovn-installed in OVS
Jan 23 09:22:35 compute-0 ovn_controller[94697]: 2026-01-23T09:22:35Z|00209|binding|INFO|Setting lport 245227da-dfb0-477a-8626-df9b5425ae01 up in Southbound
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.986 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 nova_compute[182092]: 2026-01-23 09:22:35.989 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.992 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5a04b5b9-5590-4c00-9889-a2c4eacc9483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.996 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdae6fd97-31 in ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.998 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdae6fd97-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:22:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.998 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1002df-9589-4ff1-81d5-49847fcf790d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:35.999 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[66568451-5ee3-4d0b-841a-1cbf1688a1ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.008 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7163dca6-a3d2-416d-84ec-a2a9dd2b6e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 systemd-udevd[217220]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:22:36 compute-0 systemd-machined[153562]: New machine qemu-32-instance-00000049.
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.018 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ca923909-ef3a-420f-a2b9-3d17c96e3bb0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-00000049.
Jan 23 09:22:36 compute-0 NetworkManager[54920]: <info>  [1769160156.0250] device (tap245227da-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:22:36 compute-0 NetworkManager[54920]: <info>  [1769160156.0255] device (tap245227da-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.048 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[20900c38-cb2f-46ca-89f9-c192f15a8afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 NetworkManager[54920]: <info>  [1769160156.0554] manager: (tapdae6fd97-30): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.056 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab474e8-fac1-4b9f-bbd6-484e51fe13cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 systemd-udevd[217224]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.084 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[edac83d6-8143-40b8-a3b1-ebefdb5e9510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.090 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0eb39c-70e2-4c51-9988-b5b5aa251c2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 NetworkManager[54920]: <info>  [1769160156.1065] device (tapdae6fd97-30): carrier: link connected
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.111 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[310954e1-8896-45d1-be22-4ac77ebf1c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.123 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[99afb75f-97c6-4057-af53-026a4abe3475]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdae6fd97-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:10:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366159, 'reachable_time': 30452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217245, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.134 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9079d34c-d56f-4192-984a-b28a456c14c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:1035'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366159, 'tstamp': 366159}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217246, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.146 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aab02af8-7a25-4753-bc91-be7d62931f2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdae6fd97-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:10:35'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366159, 'reachable_time': 30452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217247, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.170 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2bc12e-2963-4f67-8b10-18b78b5e092f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.212 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1efb73d6-d6d0-4dfd-83fb-1a430d7b0919]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.214 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdae6fd97-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.214 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.214 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdae6fd97-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:36 compute-0 NetworkManager[54920]: <info>  [1769160156.2167] manager: (tapdae6fd97-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 23 09:22:36 compute-0 kernel: tapdae6fd97-30: entered promiscuous mode
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.221 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.226 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdae6fd97-30, col_values=(('external_ids', {'iface-id': 'fa4a8929-9892-4386-8add-a0f479d3fa44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:36 compute-0 ovn_controller[94697]: 2026-01-23T09:22:36Z|00210|binding|INFO|Releasing lport fa4a8929-9892-4386-8add-a0f479d3fa44 from this chassis (sb_readonly=0)
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.238 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.241 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.242 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[385d12fc-00ae-4e8a-89cf-9e30c3e06b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.242 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/dae6fd97-3d1f-488f-995e-721a38374a58.pid.haproxy
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID dae6fd97-3d1f-488f-995e-721a38374a58
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:22:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:36.243 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'env', 'PROCESS_TAG=haproxy-dae6fd97-3d1f-488f-995e-721a38374a58', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dae6fd97-3d1f-488f-995e-721a38374a58.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.440 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160156.4403317, 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.441 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] VM Started (Lifecycle Event)
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.459 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.464 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160156.440432, 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.464 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] VM Paused (Lifecycle Event)
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.476 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.478 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:36 compute-0 nova_compute[182092]: 2026-01-23 09:22:36.515 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:36 compute-0 podman[217284]: 2026-01-23 09:22:36.532789855 +0000 UTC m=+0.033201249 container create 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:22:36 compute-0 systemd[1]: Started libpod-conmon-3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0.scope.
Jan 23 09:22:36 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:22:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5892c6d29c0cdb13e89231bb8f890538a7bdebaab49717aeab194426bf57605a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:22:36 compute-0 podman[217284]: 2026-01-23 09:22:36.604181252 +0000 UTC m=+0.104592656 container init 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:22:36 compute-0 podman[217284]: 2026-01-23 09:22:36.608407075 +0000 UTC m=+0.108818469 container start 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:22:36 compute-0 podman[217284]: 2026-01-23 09:22:36.517467137 +0000 UTC m=+0.017878541 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:22:36 compute-0 podman[217297]: 2026-01-23 09:22:36.618142832 +0000 UTC m=+0.056475418 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:22:36 compute-0 podman[217294]: 2026-01-23 09:22:36.620347983 +0000 UTC m=+0.059292072 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:22:36 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [NOTICE]   (217331) : New worker (217339) forked
Jan 23 09:22:36 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [NOTICE]   (217331) : Loading success.
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.007 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.858 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160142.8580444, 39d03a92-ae38-47c1-aa4a-8041f0b84b15 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.859 182096 INFO nova.compute.manager [-] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] VM Stopped (Lifecycle Event)
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.881 182096 DEBUG nova.compute.manager [None req-f8420ecf-eb88-49e7-a699-6c145e900d39 - - - - - -] [instance: 39d03a92-ae38-47c1-aa4a-8041f0b84b15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.989 182096 DEBUG nova.network.neutron [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Updated VIF entry in instance network info cache for port 245227da-dfb0-477a-8626-df9b5425ae01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:22:37 compute-0 nova_compute[182092]: 2026-01-23 09:22:37.989 182096 DEBUG nova.network.neutron [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Updating instance_info_cache with network_info: [{"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.013 182096 DEBUG oslo_concurrency.lockutils [req-9a3efa2c-b0f4-4876-9651-6d23334e4b45 req-90141b6a-7cd2-4df2-b0df-76cd1a5ccca2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.106 182096 DEBUG nova.compute.manager [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.107 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.107 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.108 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.108 182096 DEBUG nova.compute.manager [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Processing event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.108 182096 DEBUG nova.compute.manager [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.108 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.109 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.109 182096 DEBUG oslo_concurrency.lockutils [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.110 182096 DEBUG nova.compute.manager [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] No waiting events found dispatching network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.110 182096 WARNING nova.compute.manager [req-c11400f4-4707-47da-9b7d-4ad51fa5279f req-a36d278c-b983-4442-a031-d2402aa7d0c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received unexpected event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 for instance with vm_state building and task_state spawning.
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.111 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.113 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160158.1131883, 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.113 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] VM Resumed (Lifecycle Event)
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.115 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.117 182096 INFO nova.virt.libvirt.driver [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Instance spawned successfully.
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.117 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.135 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.137 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.145 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.145 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.146 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.146 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.147 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.147 182096 DEBUG nova.virt.libvirt.driver [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.170 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.267 182096 INFO nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Took 7.93 seconds to spawn the instance on the hypervisor.
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.267 182096 DEBUG nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.554 182096 INFO nova.compute.manager [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Took 8.77 seconds to build instance.
Jan 23 09:22:38 compute-0 nova_compute[182092]: 2026-01-23 09:22:38.577 182096 DEBUG oslo_concurrency.lockutils [None req-bc0a6ea1-3240-464f-b6ec-b6b3ed75844d 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:39 compute-0 podman[217344]: 2026-01-23 09:22:39.211151981 +0000 UTC m=+0.045844792 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible)
Jan 23 09:22:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:39.857 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:39.858 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:39.858 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:40 compute-0 nova_compute[182092]: 2026-01-23 09:22:40.089 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:40 compute-0 nova_compute[182092]: 2026-01-23 09:22:40.904 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160145.9034986, 88007f09-2a63-400d-b5d6-8cc5e235cfec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:40 compute-0 nova_compute[182092]: 2026-01-23 09:22:40.905 182096 INFO nova.compute.manager [-] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] VM Stopped (Lifecycle Event)
Jan 23 09:22:40 compute-0 nova_compute[182092]: 2026-01-23 09:22:40.925 182096 DEBUG nova.compute.manager [None req-d7e36a47-813e-40f7-b7bc-9d376a0c9e1d - - - - - -] [instance: 88007f09-2a63-400d-b5d6-8cc5e235cfec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.175 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.175 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.176 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.176 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.176 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.183 182096 INFO nova.compute.manager [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Terminating instance
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.189 182096 DEBUG nova.compute.manager [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:22:41 compute-0 kernel: tap245227da-df (unregistering): left promiscuous mode
Jan 23 09:22:41 compute-0 NetworkManager[54920]: <info>  [1769160161.2101] device (tap245227da-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:22:41 compute-0 ovn_controller[94697]: 2026-01-23T09:22:41Z|00211|binding|INFO|Releasing lport 245227da-dfb0-477a-8626-df9b5425ae01 from this chassis (sb_readonly=0)
Jan 23 09:22:41 compute-0 ovn_controller[94697]: 2026-01-23T09:22:41Z|00212|binding|INFO|Setting lport 245227da-dfb0-477a-8626-df9b5425ae01 down in Southbound
Jan 23 09:22:41 compute-0 ovn_controller[94697]: 2026-01-23T09:22:41Z|00213|binding|INFO|Removing iface tap245227da-df ovn-installed in OVS
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.216 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.226 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:c9:19 10.100.0.9'], port_security=['fa:16:3e:a2:c9:19 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dae6fd97-3d1f-488f-995e-721a38374a58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7ee14425b6747efb2e72842d0c056ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '91792383-edee-41db-a8be-9e88e85faeab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3159c470-c9ca-403b-a7f0-e2ef4dc1d89a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=245227da-dfb0-477a-8626-df9b5425ae01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.227 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 245227da-dfb0-477a-8626-df9b5425ae01 in datapath dae6fd97-3d1f-488f-995e-721a38374a58 unbound from our chassis
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.229 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dae6fd97-3d1f-488f-995e-721a38374a58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.230 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4f76e3-cf3b-4521-9683-420141948c74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.232 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 namespace which is not needed anymore
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000049.scope: Deactivated successfully.
Jan 23 09:22:41 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000049.scope: Consumed 3.471s CPU time.
Jan 23 09:22:41 compute-0 systemd-machined[153562]: Machine qemu-32-instance-00000049 terminated.
Jan 23 09:22:41 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [NOTICE]   (217331) : haproxy version is 2.8.14-c23fe91
Jan 23 09:22:41 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [NOTICE]   (217331) : path to executable is /usr/sbin/haproxy
Jan 23 09:22:41 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [ALERT]    (217331) : Current worker (217339) exited with code 143 (Terminated)
Jan 23 09:22:41 compute-0 neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58[217312]: [WARNING]  (217331) : All workers exited. Exiting... (0)
Jan 23 09:22:41 compute-0 systemd[1]: libpod-3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0.scope: Deactivated successfully.
Jan 23 09:22:41 compute-0 podman[217394]: 2026-01-23 09:22:41.339375431 +0000 UTC m=+0.035354732 container died 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:22:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0-userdata-shm.mount: Deactivated successfully.
Jan 23 09:22:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-5892c6d29c0cdb13e89231bb8f890538a7bdebaab49717aeab194426bf57605a-merged.mount: Deactivated successfully.
Jan 23 09:22:41 compute-0 podman[217394]: 2026-01-23 09:22:41.361056815 +0000 UTC m=+0.057036116 container cleanup 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:22:41 compute-0 systemd[1]: libpod-conmon-3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0.scope: Deactivated successfully.
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.405 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 podman[217417]: 2026-01-23 09:22:41.410033705 +0000 UTC m=+0.033863047 container remove 3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.414 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.418 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d1e920-2d60-44cc-8aa1-f02a56cbe6f9]: (4, ('Fri Jan 23 09:22:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 (3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0)\n3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0\nFri Jan 23 09:22:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 (3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0)\n3f240da9de0797e78eec1ca70d8a3ffb729280f4a98ae25bfe6b448fe46210b0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.421 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8124378d-71ae-434d-b8b6-5a42832173d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.422 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdae6fd97-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:41 compute-0 kernel: tapdae6fd97-30: left promiscuous mode
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.433 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.443 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0150cd-4cdd-4d50-9aee-99374c4facf9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.446 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.447 182096 INFO nova.virt.libvirt.driver [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Instance destroyed successfully.
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.447 182096 DEBUG nova.objects.instance [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lazy-loading 'resources' on Instance uuid 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.458 182096 DEBUG nova.virt.libvirt.vif [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:22:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-379842775',display_name='tempest-MultipleCreateTestJSON-server-379842775-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-379842775-1',id=73,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:22:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a7ee14425b6747efb2e72842d0c056ad',ramdisk_id='',reservation_id='r-b6v7lkbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1292493636',owner_user_name='tempest-MultipleCreateTestJSON-1292493636-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:22:38Z,user_data=None,user_id='98de0e9eb03748a688af5ed612c515ce',uuid=1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.458 182096 DEBUG nova.network.os_vif_util [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converting VIF {"id": "245227da-dfb0-477a-8626-df9b5425ae01", "address": "fa:16:3e:a2:c9:19", "network": {"id": "dae6fd97-3d1f-488f-995e-721a38374a58", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-330089943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7ee14425b6747efb2e72842d0c056ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap245227da-df", "ovs_interfaceid": "245227da-dfb0-477a-8626-df9b5425ae01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.459 182096 DEBUG nova.network.os_vif_util [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.459 182096 DEBUG os_vif [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.461 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.461 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap245227da-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.454 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[22c82665-53fd-48c6-b4a7-afd10bcd1d06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.455 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[047f2b27-38ee-4ae7-977e-147b1c0359db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.466 182096 INFO os_vif [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:c9:19,bridge_name='br-int',has_traffic_filtering=True,id=245227da-dfb0-477a-8626-df9b5425ae01,network=Network(dae6fd97-3d1f-488f-995e-721a38374a58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap245227da-df')
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.466 182096 INFO nova.virt.libvirt.driver [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Deleting instance files /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214_del
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.467 182096 INFO nova.virt.libvirt.driver [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Deletion of /var/lib/nova/instances/1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214_del complete
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.471 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a2be48de-89fc-4c61-b776-3d98d6736b63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366153, 'reachable_time': 21964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217447, 'error': None, 'target': 'ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 systemd[1]: run-netns-ovnmeta\x2ddae6fd97\x2d3d1f\x2d488f\x2d995e\x2d721a38374a58.mount: Deactivated successfully.
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.473 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dae6fd97-3d1f-488f-995e-721a38374a58 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:22:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:22:41.474 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[524f1a79-b4b2-41bd-826b-642646f8d6b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.545 182096 DEBUG nova.compute.manager [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-unplugged-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.545 182096 DEBUG oslo_concurrency.lockutils [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.545 182096 DEBUG oslo_concurrency.lockutils [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.546 182096 DEBUG oslo_concurrency.lockutils [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.546 182096 DEBUG nova.compute.manager [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] No waiting events found dispatching network-vif-unplugged-245227da-dfb0-477a-8626-df9b5425ae01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.546 182096 DEBUG nova.compute.manager [req-c06b4ef4-4a8b-4964-9b68-aadb8a61c9f0 req-7f109c58-a53c-4d85-9a33-d31f8fe7ed46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-unplugged-245227da-dfb0-477a-8626-df9b5425ae01 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.556 182096 INFO nova.compute.manager [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.557 182096 DEBUG oslo.service.loopingcall [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.557 182096 DEBUG nova.compute.manager [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:22:41 compute-0 nova_compute[182092]: 2026-01-23 09:22:41.557 182096 DEBUG nova.network.neutron [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.008 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.424 182096 DEBUG nova.network.neutron [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.452 182096 INFO nova.compute.manager [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Took 0.89 seconds to deallocate network for instance.
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.536 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.536 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.636 182096 DEBUG nova.compute.provider_tree [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.650 182096 DEBUG nova.scheduler.client.report [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.666 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.708 182096 INFO nova.scheduler.client.report [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Deleted allocations for instance 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214
Jan 23 09:22:42 compute-0 nova_compute[182092]: 2026-01-23 09:22:42.783 182096 DEBUG oslo_concurrency.lockutils [None req-a12683bf-7528-4845-9761-deaaa1ef3bc7 98de0e9eb03748a688af5ed612c515ce a7ee14425b6747efb2e72842d0c056ad - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.727 182096 DEBUG nova.compute.manager [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.727 182096 DEBUG oslo_concurrency.lockutils [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.728 182096 DEBUG oslo_concurrency.lockutils [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.728 182096 DEBUG oslo_concurrency.lockutils [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.728 182096 DEBUG nova.compute.manager [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] No waiting events found dispatching network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.728 182096 WARNING nova.compute.manager [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received unexpected event network-vif-plugged-245227da-dfb0-477a-8626-df9b5425ae01 for instance with vm_state deleted and task_state None.
Jan 23 09:22:43 compute-0 nova_compute[182092]: 2026-01-23 09:22:43.729 182096 DEBUG nova.compute.manager [req-59948277-7cfe-4563-ab8f-2252031b18ed req-6ecb5695-53e7-46f8-b9ca-b7e9e7413878 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Received event network-vif-deleted-245227da-dfb0-477a-8626-df9b5425ae01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:22:45 compute-0 nova_compute[182092]: 2026-01-23 09:22:45.647 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:22:46 compute-0 nova_compute[182092]: 2026-01-23 09:22:46.359 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:46 compute-0 nova_compute[182092]: 2026-01-23 09:22:46.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:47 compute-0 nova_compute[182092]: 2026-01-23 09:22:47.009 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:47 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 23 09:22:47 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000048.scope: Consumed 11.095s CPU time.
Jan 23 09:22:47 compute-0 systemd-machined[153562]: Machine qemu-31-instance-00000048 terminated.
Jan 23 09:22:47 compute-0 podman[217452]: 2026-01-23 09:22:47.834258218 +0000 UTC m=+0.059745518 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.656 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance shutdown successfully after 13 seconds.
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.659 182096 INFO nova.virt.libvirt.driver [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance destroyed successfully.
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.662 182096 INFO nova.virt.libvirt.driver [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance destroyed successfully.
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.662 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Deleting instance files /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5_del
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.663 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Deletion of /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5_del complete
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.857 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.857 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating image(s)
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.858 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.858 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.859 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.859 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:48 compute-0 nova_compute[182092]: 2026-01-23 09:22:48.859 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.525 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.570 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.part --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.571 182096 DEBUG nova.virt.images [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] 11cc348c-4b05-42ba-a4b9-513b91dede76 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.572 182096 DEBUG nova.privsep.utils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.572 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.part /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.622 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.part /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.converted" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.625 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.670 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c.converted --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.671 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.682 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.726 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.727 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.727 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.737 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.781 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.782 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.800 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.801 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.802 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.845 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.846 182096 DEBUG nova.virt.disk.api [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Checking if we can resize image /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.847 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.892 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.892 182096 DEBUG nova.virt.disk.api [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Cannot resize image /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.893 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.893 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Ensure instance console log exists: /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.894 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.894 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.894 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.895 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.898 182096 WARNING nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.904 182096 DEBUG nova.virt.libvirt.host [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.904 182096 DEBUG nova.virt.libvirt.host [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.906 182096 DEBUG nova.virt.libvirt.host [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.907 182096 DEBUG nova.virt.libvirt.host [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.908 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.908 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.909 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.909 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.909 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.909 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.909 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.910 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.910 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.910 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.910 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.910 182096 DEBUG nova.virt.hardware [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.911 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.952 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <uuid>3565c36e-66c4-408c-9b5d-34356739efc5</uuid>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <name>instance-00000048</name>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV247Test-server-2102390312</nova:name>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:22:50</nova:creationTime>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:user uuid="06fe70d62c9946f9b10d7d260a8d24aa">tempest-ServerShowV247Test-2078701084-project-member</nova:user>
Jan 23 09:22:50 compute-0 nova_compute[182092]:         <nova:project uuid="a6a3a3aa1b474ddd8cf8d23d2ea93efa">tempest-ServerShowV247Test-2078701084</nova:project>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="11cc348c-4b05-42ba-a4b9-513b91dede76"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <system>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="serial">3565c36e-66c4-408c-9b5d-34356739efc5</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="uuid">3565c36e-66c4-408c-9b5d-34356739efc5</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </system>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <os>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </os>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <features>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </features>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/console.log" append="off"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <video>
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </video>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:22:50 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:22:50 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:22:50 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:22:50 compute-0 nova_compute[182092]: </domain>
Jan 23 09:22:50 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.994 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.994 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:22:50 compute-0 nova_compute[182092]: 2026-01-23 09:22:50.994 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Using config drive
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.023 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.061 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'keypairs' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.380 182096 INFO nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Creating config drive at /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.384 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdk_16w0o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.502 182096 DEBUG oslo_concurrency.processutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdk_16w0o" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:22:51 compute-0 systemd-machined[153562]: New machine qemu-33-instance-00000048.
Jan 23 09:22:51 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000048.
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.989 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 3565c36e-66c4-408c-9b5d-34356739efc5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.990 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160171.989462, 3565c36e-66c4-408c-9b5d-34356739efc5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.990 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] VM Resumed (Lifecycle Event)
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.992 182096 DEBUG nova.compute.manager [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.993 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.996 182096 INFO nova.virt.libvirt.driver [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance spawned successfully.
Jan 23 09:22:51 compute-0 nova_compute[182092]: 2026-01-23 09:22:51.996 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.009 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.025 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.029 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.033 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.033 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.033 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.034 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.034 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.035 182096 DEBUG nova.virt.libvirt.driver [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.132 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.132 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160171.9903836, 3565c36e-66c4-408c-9b5d-34356739efc5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.132 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] VM Started (Lifecycle Event)
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.169 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.171 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.200 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.205 182096 DEBUG nova.compute.manager [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.271 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.272 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.272 182096 DEBUG nova.objects.instance [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:22:52 compute-0 nova_compute[182092]: 2026-01-23 09:22:52.324 182096 DEBUG oslo_concurrency.lockutils [None req-b9c78c3d-c879-4793-b987-5d5497a8d52f 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.719 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "3565c36e-66c4-408c-9b5d-34356739efc5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.719 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.719 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "3565c36e-66c4-408c-9b5d-34356739efc5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.720 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.720 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.726 182096 INFO nova.compute.manager [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Terminating instance
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.732 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "refresh_cache-3565c36e-66c4-408c-9b5d-34356739efc5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.732 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquired lock "refresh_cache-3565c36e-66c4-408c-9b5d-34356739efc5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.733 182096 DEBUG nova.network.neutron [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:22:54 compute-0 nova_compute[182092]: 2026-01-23 09:22:54.936 182096 DEBUG nova.network.neutron [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.266 182096 DEBUG nova.network.neutron [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.279 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Releasing lock "refresh_cache-3565c36e-66c4-408c-9b5d-34356739efc5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.279 182096 DEBUG nova.compute.manager [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:22:55 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 23 09:22:55 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000048.scope: Consumed 3.747s CPU time.
Jan 23 09:22:55 compute-0 systemd-machined[153562]: Machine qemu-33-instance-00000048 terminated.
Jan 23 09:22:55 compute-0 podman[217544]: 2026-01-23 09:22:55.358269264 +0000 UTC m=+0.041244792 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:22:55 compute-0 podman[217543]: 2026-01-23 09:22:55.372476306 +0000 UTC m=+0.057623030 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.510 182096 INFO nova.virt.libvirt.driver [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance destroyed successfully.
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.510 182096 DEBUG nova.objects.instance [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lazy-loading 'resources' on Instance uuid 3565c36e-66c4-408c-9b5d-34356739efc5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.528 182096 INFO nova.virt.libvirt.driver [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Deleting instance files /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5_del
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.529 182096 INFO nova.virt.libvirt.driver [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Deletion of /var/lib/nova/instances/3565c36e-66c4-408c-9b5d-34356739efc5_del complete
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.572 182096 INFO nova.compute.manager [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Took 0.29 seconds to destroy the instance on the hypervisor.
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.573 182096 DEBUG oslo.service.loopingcall [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.573 182096 DEBUG nova.compute.manager [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.573 182096 DEBUG nova.network.neutron [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.716 182096 DEBUG nova.network.neutron [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.730 182096 DEBUG nova.network.neutron [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.743 182096 INFO nova.compute.manager [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Took 0.17 seconds to deallocate network for instance.
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.806 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.806 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.891 182096 DEBUG nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.968 182096 DEBUG nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:22:55 compute-0 nova_compute[182092]: 2026-01-23 09:22:55.969 182096 DEBUG nova.compute.provider_tree [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.075 182096 DEBUG nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.109 182096 DEBUG nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.167 182096 DEBUG nova.compute.provider_tree [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.187 182096 DEBUG nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.205 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.224 182096 INFO nova.scheduler.client.report [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Deleted allocations for instance 3565c36e-66c4-408c-9b5d-34356739efc5
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.292 182096 DEBUG oslo_concurrency.lockutils [None req-e4eae660-d177-4fc5-8aa5-251f2eca9b38 06fe70d62c9946f9b10d7d260a8d24aa a6a3a3aa1b474ddd8cf8d23d2ea93efa - - default default] Lock "3565c36e-66c4-408c-9b5d-34356739efc5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.437 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160161.4366279, 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.438 182096 INFO nova.compute.manager [-] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] VM Stopped (Lifecycle Event)
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.461 182096 DEBUG nova.compute.manager [None req-015c31e2-bc92-4307-aae2-a451014b0c55 - - - - - -] [instance: 1d8bd4f8-0179-4aa7-8e2c-b3dd0c862214] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:22:56 compute-0 nova_compute[182092]: 2026-01-23 09:22:56.466 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:57 compute-0 nova_compute[182092]: 2026-01-23 09:22:57.011 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:22:57 compute-0 nova_compute[182092]: 2026-01-23 09:22:57.668 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:58 compute-0 nova_compute[182092]: 2026-01-23 09:22:58.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:59 compute-0 nova_compute[182092]: 2026-01-23 09:22:59.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:59 compute-0 nova_compute[182092]: 2026-01-23 09:22:59.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:22:59 compute-0 nova_compute[182092]: 2026-01-23 09:22:59.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.671 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.868 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.869 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5667MB free_disk=73.26424407958984GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.870 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.870 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.941 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.942 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.970 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:23:00 compute-0 nova_compute[182092]: 2026-01-23 09:23:00.989 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:23:01 compute-0 nova_compute[182092]: 2026-01-23 09:23:01.015 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:23:01 compute-0 nova_compute[182092]: 2026-01-23 09:23:01.015 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:01 compute-0 nova_compute[182092]: 2026-01-23 09:23:01.467 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.013 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.015 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.016 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.016 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.031 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.496 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.496 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.513 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.619 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.619 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.624 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.624 182096 INFO nova.compute.claims [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.670 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.734 182096 DEBUG nova.compute.provider_tree [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.747 182096 DEBUG nova.scheduler.client.report [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.760 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.761 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.814 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.814 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.839 182096 INFO nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.860 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.992 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.993 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.993 182096 INFO nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Creating image(s)
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.994 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.994 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:02 compute-0 nova_compute[182092]: 2026-01-23 09:23:02.995 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.005 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.051 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.052 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.053 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.062 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.106 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.107 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.128 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.129 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.130 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.174 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.175 182096 DEBUG nova.virt.disk.api [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.175 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.206 182096 DEBUG nova.policy [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.222 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.222 182096 DEBUG nova.virt.disk.api [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.222 182096 DEBUG nova.objects.instance [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.241 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.241 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Ensure instance console log exists: /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.241 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.242 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:03 compute-0 nova_compute[182092]: 2026-01-23 09:23:03.242 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:04 compute-0 nova_compute[182092]: 2026-01-23 09:23:04.755 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Successfully created port: b1eb1ee2-fc97-423a-bac5-6219bd097839 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.026 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Successfully updated port: b1eb1ee2-fc97-423a-bac5-6219bd097839 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.039 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.039 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.039 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.214 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.469 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:06 compute-0 nova_compute[182092]: 2026-01-23 09:23:06.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.013 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:07 compute-0 podman[217608]: 2026-01-23 09:23:07.198149797 +0000 UTC m=+0.035413321 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:23:07 compute-0 podman[217609]: 2026-01-23 09:23:07.20721565 +0000 UTC m=+0.042763959 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.515 182096 DEBUG nova.compute.manager [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.515 182096 DEBUG nova.compute.manager [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing instance network info cache due to event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.515 182096 DEBUG oslo_concurrency.lockutils [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.542 182096 DEBUG nova.network.neutron [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.557 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.557 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance network_info: |[{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.557 182096 DEBUG oslo_concurrency.lockutils [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.557 182096 DEBUG nova.network.neutron [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.559 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start _get_guest_xml network_info=[{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.562 182096 WARNING nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.567 182096 DEBUG nova.virt.libvirt.host [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.568 182096 DEBUG nova.virt.libvirt.host [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.572 182096 DEBUG nova.virt.libvirt.host [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.572 182096 DEBUG nova.virt.libvirt.host [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.573 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.573 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.574 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.574 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.574 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.574 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.575 182096 DEBUG nova.virt.hardware [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.578 182096 DEBUG nova.virt.libvirt.vif [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.578 182096 DEBUG nova.network.os_vif_util [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.579 182096 DEBUG nova.network.os_vif_util [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.580 182096 DEBUG nova.objects.instance [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.590 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <uuid>74eaa05c-e365-4879-af9a-1bf1c102eda7</uuid>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <name>instance-0000004b</name>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-1490124151</nova:name>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:23:07</nova:creationTime>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         <nova:port uuid="b1eb1ee2-fc97-423a-bac5-6219bd097839">
Jan 23 09:23:07 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <system>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="serial">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="uuid">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </system>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <os>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </os>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <features>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </features>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:27:52:52"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <target dev="tapb1eb1ee2-fc"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/console.log" append="off"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <video>
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </video>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:23:07 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:23:07 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:23:07 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:23:07 compute-0 nova_compute[182092]: </domain>
Jan 23 09:23:07 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.591 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Preparing to wait for external event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.591 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.591 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.592 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.592 182096 DEBUG nova.virt.libvirt.vif [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.592 182096 DEBUG nova.network.os_vif_util [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.593 182096 DEBUG nova.network.os_vif_util [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.593 182096 DEBUG os_vif [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.594 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.594 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.596 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.596 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1eb1ee2-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.596 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1eb1ee2-fc, col_values=(('external_ids', {'iface-id': 'b1eb1ee2-fc97-423a-bac5-6219bd097839', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:52:52', 'vm-uuid': '74eaa05c-e365-4879-af9a-1bf1c102eda7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.597 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:07 compute-0 NetworkManager[54920]: <info>  [1769160187.5987] manager: (tapb1eb1ee2-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.601 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.602 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.603 182096 INFO os_vif [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.669 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.669 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.669 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No VIF found with MAC fa:16:3e:27:52:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:23:07 compute-0 nova_compute[182092]: 2026-01-23 09:23:07.669 182096 INFO nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Using config drive
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.297 182096 INFO nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Creating config drive at /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.302 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp54dpu7e6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.421 182096 DEBUG oslo_concurrency.processutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp54dpu7e6" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:08 compute-0 kernel: tapb1eb1ee2-fc: entered promiscuous mode
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.4614] manager: (tapb1eb1ee2-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 23 09:23:08 compute-0 ovn_controller[94697]: 2026-01-23T09:23:08Z|00214|binding|INFO|Claiming lport b1eb1ee2-fc97-423a-bac5-6219bd097839 for this chassis.
Jan 23 09:23:08 compute-0 ovn_controller[94697]: 2026-01-23T09:23:08Z|00215|binding|INFO|b1eb1ee2-fc97-423a-bac5-6219bd097839: Claiming fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.480 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.481 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.482 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:08 compute-0 systemd-udevd[217662]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.491 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[744cb6c2-aa98-476d-9024-eb6cba6aaa4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.492 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.494 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.494 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd04bd9-9860-4ee0-8f7a-a9b73dd784ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.495 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f7560c60-458f-4407-9639-de1978e9c4ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.4994] device (tapb1eb1ee2-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.5001] device (tapb1eb1ee2-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.506 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4938ee-0b97-4bc4-9287-a41785ed44b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 systemd-machined[153562]: New machine qemu-34-instance-0000004b.
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.525 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.523 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a2e249-dbbc-427b-bdbd-956efdc4cd53]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-0000004b.
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.530 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_controller[94697]: 2026-01-23T09:23:08Z|00216|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 ovn-installed in OVS
Jan 23 09:23:08 compute-0 ovn_controller[94697]: 2026-01-23T09:23:08Z|00217|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 up in Southbound
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.545 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[3de1b25c-c311-484e-a443-ddd09c2fd885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.5485] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.547 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7603662a-d135-4613-87fe-2a9f3d8303f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.572 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4b48fd-c666-437e-ab83-6bf8efe162a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.574 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[99f49199-cfd1-4622-abfc-97e0a7165b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.5874] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.591 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4efa1aad-480f-4ae5-8437-9304bfbbc7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f26be2e-8dbd-4e12-b919-572ca065e8c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369407, 'reachable_time': 35861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217690, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.613 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[111897c1-85c7-4803-8263-f03b8ab5498f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369407, 'tstamp': 369407}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217691, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.624 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9916a8ac-43bb-4aac-a4f9-4c98ab423190]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369407, 'reachable_time': 35861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217692, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.640 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6c4c0c-b5cd-45a5-a384-b7002b5ef65d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.671 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bfcc96a4-c94f-46d6-a1cc-f35582dca4bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.672 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.672 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.673 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:08 compute-0 NetworkManager[54920]: <info>  [1769160188.6748] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 23 09:23:08 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.674 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.679 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_controller[94697]: 2026-01-23T09:23:08Z|00218|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.681 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.681 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b396137-fd22-49ee-8973-3213482d032f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.684 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:23:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:08.684 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.743 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160188.7427247, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.743 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Started (Lifecycle Event)
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.769 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.771 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160188.7428553, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.771 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Paused (Lifecycle Event)
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.785 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.787 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.807 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.920 182096 DEBUG nova.compute.manager [req-f3f77e31-dadd-404c-93ad-5f46c3863cc1 req-bf0073a2-04fb-40d6-b148-5656a4456715 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.920 182096 DEBUG oslo_concurrency.lockutils [req-f3f77e31-dadd-404c-93ad-5f46c3863cc1 req-bf0073a2-04fb-40d6-b148-5656a4456715 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.921 182096 DEBUG oslo_concurrency.lockutils [req-f3f77e31-dadd-404c-93ad-5f46c3863cc1 req-bf0073a2-04fb-40d6-b148-5656a4456715 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.921 182096 DEBUG oslo_concurrency.lockutils [req-f3f77e31-dadd-404c-93ad-5f46c3863cc1 req-bf0073a2-04fb-40d6-b148-5656a4456715 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.921 182096 DEBUG nova.compute.manager [req-f3f77e31-dadd-404c-93ad-5f46c3863cc1 req-bf0073a2-04fb-40d6-b148-5656a4456715 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Processing event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.922 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.924 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160188.9244843, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.924 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Resumed (Lifecycle Event)
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.926 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.928 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance spawned successfully.
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.928 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.949 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.953 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.955 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.955 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.956 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.956 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.957 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.958 182096 DEBUG nova.virt.libvirt.driver [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:08 compute-0 podman[217727]: 2026-01-23 09:23:08.966439451 +0000 UTC m=+0.031900193 container create 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:23:08 compute-0 nova_compute[182092]: 2026-01-23 09:23:08.997 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:23:08 compute-0 systemd[1]: Started libpod-conmon-306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef.scope.
Jan 23 09:23:09 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:23:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa0099ac8aaf20fcce1b6d0f621c0e89c2dabe64f607e864a6af1dfafbfc43e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:23:09 compute-0 podman[217727]: 2026-01-23 09:23:09.027371113 +0000 UTC m=+0.092831854 container init 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:23:09 compute-0 podman[217727]: 2026-01-23 09:23:09.032023581 +0000 UTC m=+0.097484322 container start 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:23:09 compute-0 podman[217727]: 2026-01-23 09:23:08.951916564 +0000 UTC m=+0.017377325 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.033 182096 INFO nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Took 6.04 seconds to spawn the instance on the hypervisor.
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.034 182096 DEBUG nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:09 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [NOTICE]   (217743) : New worker (217745) forked
Jan 23 09:23:09 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [NOTICE]   (217743) : Loading success.
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.131 182096 INFO nova.compute.manager [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Took 6.53 seconds to build instance.
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.156 182096 DEBUG oslo_concurrency.lockutils [None req-9a5b3204-2f87-4885-8543-95380073ecd8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.661 182096 DEBUG nova.network.neutron [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updated VIF entry in instance network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.662 182096 DEBUG nova.network.neutron [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:09 compute-0 nova_compute[182092]: 2026-01-23 09:23:09.690 182096 DEBUG oslo_concurrency.lockutils [req-0adf23bc-83a1-467f-91f7-90c732bd5a23 req-a4613f6b-4149-48b2-839d-6f1fc818bd2b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:10 compute-0 podman[217750]: 2026-01-23 09:23:10.207485757 +0000 UTC m=+0.041189668 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 23 09:23:10 compute-0 nova_compute[182092]: 2026-01-23 09:23:10.509 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160175.5085306, 3565c36e-66c4-408c-9b5d-34356739efc5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:10 compute-0 nova_compute[182092]: 2026-01-23 09:23:10.509 182096 INFO nova.compute.manager [-] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] VM Stopped (Lifecycle Event)
Jan 23 09:23:10 compute-0 nova_compute[182092]: 2026-01-23 09:23:10.523 182096 DEBUG nova.compute.manager [None req-66fec66f-df3b-4dab-8146-f77eae6bd677 - - - - - -] [instance: 3565c36e-66c4-408c-9b5d-34356739efc5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.093 182096 DEBUG nova.compute.manager [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.094 182096 DEBUG oslo_concurrency.lockutils [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.094 182096 DEBUG oslo_concurrency.lockutils [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.094 182096 DEBUG oslo_concurrency.lockutils [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.094 182096 DEBUG nova.compute.manager [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.094 182096 WARNING nova.compute.manager [req-de259294-d76c-4fa7-a118-02bf428af0e8 req-6634dca3-ef2b-4336-a3fe-72c89977cb98 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.723 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:11 compute-0 NetworkManager[54920]: <info>  [1769160191.7321] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 23 09:23:11 compute-0 NetworkManager[54920]: <info>  [1769160191.7325] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.837 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:11 compute-0 ovn_controller[94697]: 2026-01-23T09:23:11Z|00219|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:11 compute-0 nova_compute[182092]: 2026-01-23 09:23:11.846 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:12 compute-0 nova_compute[182092]: 2026-01-23 09:23:12.016 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:12 compute-0 nova_compute[182092]: 2026-01-23 09:23:12.598 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:13 compute-0 nova_compute[182092]: 2026-01-23 09:23:13.201 182096 DEBUG nova.compute.manager [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:13 compute-0 nova_compute[182092]: 2026-01-23 09:23:13.202 182096 DEBUG nova.compute.manager [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing instance network info cache due to event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:23:13 compute-0 nova_compute[182092]: 2026-01-23 09:23:13.202 182096 DEBUG oslo_concurrency.lockutils [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:13 compute-0 nova_compute[182092]: 2026-01-23 09:23:13.203 182096 DEBUG oslo_concurrency.lockutils [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:13 compute-0 nova_compute[182092]: 2026-01-23 09:23:13.203 182096 DEBUG nova.network.neutron [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:23:14 compute-0 nova_compute[182092]: 2026-01-23 09:23:14.624 182096 DEBUG nova.network.neutron [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updated VIF entry in instance network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:23:14 compute-0 nova_compute[182092]: 2026-01-23 09:23:14.624 182096 DEBUG nova.network.neutron [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:14 compute-0 nova_compute[182092]: 2026-01-23 09:23:14.646 182096 DEBUG oslo_concurrency.lockutils [req-3fb2b95e-3823-47a1-9cb6-df411f158515 req-0242af31-fa03-4f2e-9511-56dc28d31089 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:15 compute-0 ovn_controller[94697]: 2026-01-23T09:23:15Z|00220|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:15 compute-0 nova_compute[182092]: 2026-01-23 09:23:15.122 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:15 compute-0 nova_compute[182092]: 2026-01-23 09:23:15.586 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:17 compute-0 nova_compute[182092]: 2026-01-23 09:23:17.017 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:17 compute-0 nova_compute[182092]: 2026-01-23 09:23:17.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:18 compute-0 podman[217769]: 2026-01-23 09:23:18.241318433 +0000 UTC m=+0.079077525 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:23:20 compute-0 ovn_controller[94697]: 2026-01-23T09:23:20Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:23:20 compute-0 ovn_controller[94697]: 2026-01-23T09:23:20Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:23:20 compute-0 nova_compute[182092]: 2026-01-23 09:23:20.738 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:22 compute-0 nova_compute[182092]: 2026-01-23 09:23:22.019 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:22 compute-0 nova_compute[182092]: 2026-01-23 09:23:22.601 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:25 compute-0 nova_compute[182092]: 2026-01-23 09:23:25.808 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:26 compute-0 nova_compute[182092]: 2026-01-23 09:23:26.141 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:26 compute-0 podman[217802]: 2026-01-23 09:23:26.211220131 +0000 UTC m=+0.041021901 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:23:26 compute-0 podman[217803]: 2026-01-23 09:23:26.234202528 +0000 UTC m=+0.061830828 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:23:27 compute-0 nova_compute[182092]: 2026-01-23 09:23:27.021 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:27 compute-0 nova_compute[182092]: 2026-01-23 09:23:27.602 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:28.109 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:28 compute-0 nova_compute[182092]: 2026-01-23 09:23:28.110 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:28.111 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:23:30 compute-0 nova_compute[182092]: 2026-01-23 09:23:30.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:31.113 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.022 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.046 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.291 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.292 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.292 182096 INFO nova.compute.manager [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Rebooting instance
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.303 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.304 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.304 182096 DEBUG nova.network.neutron [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:23:32 compute-0 nova_compute[182092]: 2026-01-23 09:23:32.604 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.400 182096 DEBUG nova.network.neutron [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.436 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.443 182096 DEBUG nova.compute.manager [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:34 compute-0 kernel: tapb1eb1ee2-fc (unregistering): left promiscuous mode
Jan 23 09:23:34 compute-0 NetworkManager[54920]: <info>  [1769160214.6268] device (tapb1eb1ee2-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:23:34 compute-0 ovn_controller[94697]: 2026-01-23T09:23:34Z|00221|binding|INFO|Releasing lport b1eb1ee2-fc97-423a-bac5-6219bd097839 from this chassis (sb_readonly=0)
Jan 23 09:23:34 compute-0 ovn_controller[94697]: 2026-01-23T09:23:34Z|00222|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 down in Southbound
Jan 23 09:23:34 compute-0 ovn_controller[94697]: 2026-01-23T09:23:34Z|00223|binding|INFO|Removing iface tapb1eb1ee2-fc ovn-installed in OVS
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.631 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.635 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.636 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.638 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.638 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2b054501-087a-41be-8fda-4d2e5c00075b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.639 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.648 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 23 09:23:34 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004b.scope: Consumed 10.759s CPU time.
Jan 23 09:23:34 compute-0 systemd-machined[153562]: Machine qemu-34-instance-0000004b terminated.
Jan 23 09:23:34 compute-0 ovn_controller[94697]: 2026-01-23T09:23:34Z|00224|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.693 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [NOTICE]   (217743) : haproxy version is 2.8.14-c23fe91
Jan 23 09:23:34 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [NOTICE]   (217743) : path to executable is /usr/sbin/haproxy
Jan 23 09:23:34 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [WARNING]  (217743) : Exiting Master process...
Jan 23 09:23:34 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [ALERT]    (217743) : Current worker (217745) exited with code 143 (Terminated)
Jan 23 09:23:34 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[217739]: [WARNING]  (217743) : All workers exited. Exiting... (0)
Jan 23 09:23:34 compute-0 systemd[1]: libpod-306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef.scope: Deactivated successfully.
Jan 23 09:23:34 compute-0 podman[217861]: 2026-01-23 09:23:34.749061395 +0000 UTC m=+0.040893198 container died 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:23:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef-userdata-shm.mount: Deactivated successfully.
Jan 23 09:23:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-fa0099ac8aaf20fcce1b6d0f621c0e89c2dabe64f607e864a6af1dfafbfc43e9-merged.mount: Deactivated successfully.
Jan 23 09:23:34 compute-0 podman[217861]: 2026-01-23 09:23:34.800349403 +0000 UTC m=+0.092181206 container cleanup 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:23:34 compute-0 systemd[1]: libpod-conmon-306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef.scope: Deactivated successfully.
Jan 23 09:23:34 compute-0 podman[217888]: 2026-01-23 09:23:34.846144188 +0000 UTC m=+0.024134422 container remove 306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.853 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8d86b47f-83cb-49fc-b8f0-bbe5650c821f]: (4, ('Fri Jan 23 09:23:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef)\n306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef\nFri Jan 23 09:23:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef)\n306748ecb6f636a1c9d5c4d0796d5d71d549a204b83833c4fd9d340c2cd19aef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.855 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[38663794-3984-417a-bde1-dba177b184a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.856 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.857 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.924 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f167c96-35d9-46d6-a4ad-45c700ba953e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.930 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[89dcac20-f862-4b04-9e53-8245aa779a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.931 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[622642c7-7058-4805-9548-3f08b293fb5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.941 182096 DEBUG nova.compute.manager [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.941 182096 DEBUG oslo_concurrency.lockutils [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.941 182096 DEBUG oslo_concurrency.lockutils [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.941 182096 DEBUG oslo_concurrency.lockutils [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.941 182096 DEBUG nova.compute.manager [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.942 182096 WARNING nova.compute.manager [req-faaa11ce-6565-4e8d-9641-0d8ededd48e9 req-a45b408b-c2ae-4cc3-b529-d3d82410b6f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state reboot_started_hard.
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.945 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.953 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[77e43752-aeb7-48ba-be1f-b9bcc9c937e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369402, 'reachable_time': 39986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217910, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.954 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance destroyed successfully.
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.954 182096 DEBUG nova.objects.instance [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:34 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.957 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.958 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:23:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:34.958 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dedf10-4dfa-4f80-9b68-b03d94fa2cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.970 182096 DEBUG nova.virt.libvirt.vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:23:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.971 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.971 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.971 182096 DEBUG os_vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.972 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.973 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1eb1ee2-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.974 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.975 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.977 182096 INFO os_vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.981 182096 DEBUG nova.virt.libvirt.driver [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start _get_guest_xml network_info=[{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.984 182096 WARNING nova.virt.libvirt.driver [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.988 182096 DEBUG nova.virt.libvirt.host [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.988 182096 DEBUG nova.virt.libvirt.host [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.991 182096 DEBUG nova.virt.libvirt.host [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.991 182096 DEBUG nova.virt.libvirt.host [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.992 182096 DEBUG nova.virt.libvirt.driver [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.992 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.992 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.993 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.994 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.994 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.994 182096 DEBUG nova.virt.hardware [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:23:34 compute-0 nova_compute[182092]: 2026-01-23 09:23:34.994 182096 DEBUG nova.objects.instance [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.007 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.053 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.054 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.054 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.054 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.055 182096 DEBUG nova.virt.libvirt.vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:23:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.056 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.056 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.057 182096 DEBUG nova.objects.instance [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.076 182096 DEBUG nova.virt.libvirt.driver [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <uuid>74eaa05c-e365-4879-af9a-1bf1c102eda7</uuid>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <name>instance-0000004b</name>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-1490124151</nova:name>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:23:34</nova:creationTime>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         <nova:port uuid="b1eb1ee2-fc97-423a-bac5-6219bd097839">
Jan 23 09:23:35 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <system>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="serial">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="uuid">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </system>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <os>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </os>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <features>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </features>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:27:52:52"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <target dev="tapb1eb1ee2-fc"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/console.log" append="off"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <video>
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </video>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:23:35 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:23:35 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:23:35 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:23:35 compute-0 nova_compute[182092]: </domain>
Jan 23 09:23:35 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.078 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.124 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.125 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.171 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.172 182096 DEBUG nova.objects.instance [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.184 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.241 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.242 182096 DEBUG nova.virt.disk.api [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.242 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.287 182096 DEBUG oslo_concurrency.processutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.288 182096 DEBUG nova.virt.disk.api [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.289 182096 DEBUG nova.objects.instance [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.303 182096 DEBUG nova.virt.libvirt.vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.304 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.304 182096 DEBUG nova.network.os_vif_util [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.305 182096 DEBUG os_vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.305 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.306 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.306 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.308 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.308 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1eb1ee2-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.309 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1eb1ee2-fc, col_values=(('external_ids', {'iface-id': 'b1eb1ee2-fc97-423a-bac5-6219bd097839', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:52:52', 'vm-uuid': '74eaa05c-e365-4879-af9a-1bf1c102eda7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.310 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3106] manager: (tapb1eb1ee2-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.313 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.314 182096 INFO os_vif [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:23:35 compute-0 kernel: tapb1eb1ee2-fc: entered promiscuous mode
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.367 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.370 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_controller[94697]: 2026-01-23T09:23:35Z|00225|binding|INFO|Claiming lport b1eb1ee2-fc97-423a-bac5-6219bd097839 for this chassis.
Jan 23 09:23:35 compute-0 ovn_controller[94697]: 2026-01-23T09:23:35Z|00226|binding|INFO|b1eb1ee2-fc97-423a-bac5-6219bd097839: Claiming fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3723] manager: (tapb1eb1ee2-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Jan 23 09:23:35 compute-0 systemd-udevd[217848]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.373 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3788] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.378 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3803] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3839] device (tapb1eb1ee2-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.381 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.382 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.3845] device (tapb1eb1ee2-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.386 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.394 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[38bbf085-e06e-4247-aa15-bdc48bbeef71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.394 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.396 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.396 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[52c95aed-62ad-4a0a-a23c-d78a384a7657]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.397 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[83ea49b0-1940-4132-bec6-1d4429769d78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.405 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[07f483c9-c93b-4d5a-997a-ea27cb349909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 systemd-machined[153562]: New machine qemu-35-instance-0000004b.
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.428 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8ed4ac-0f45-4912-87ce-aef39f03951d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.452 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[39226c81-6043-4947-9748-b651e0f9d6b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-0000004b.
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.4705] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.471 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b04ba3a9-3a4b-4388-ac26-cb76016ac94a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.503 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ccd485-98f8-4f3c-b2c4-5d826b7daca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.505 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[69c818c2-df21-46bc-a721-5c03e74fa5c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.5313] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.538 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4de4fbc3-ea44-4e31-b8ca-83cb4c3f3249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.551 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf255a1f-773e-464c-8f46-9b4a273820f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372101, 'reachable_time': 34317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217970, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.562 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ff61e8-90af-4c9f-809d-51b3d66dd6cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372101, 'tstamp': 372101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217971, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.572 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.576 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0cd441-aad9-4810-8237-44519e7f753e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372101, 'reachable_time': 34317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217972, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_controller[94697]: 2026-01-23T09:23:35Z|00227|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 ovn-installed in OVS
Jan 23 09:23:35 compute-0 ovn_controller[94697]: 2026-01-23T09:23:35Z|00228|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 up in Southbound
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.604 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.605 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[54be699a-39b5-4063-9f42-593d03dd0fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.647 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cf1538-6761-49b9-89b7-d999123a4ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.648 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.648 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.648 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.649 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 NetworkManager[54920]: <info>  [1769160215.6502] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.651 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.653 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.654 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_controller[94697]: 2026-01-23T09:23:35Z|00229|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.655 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.656 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.657 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd4216f-d038-4bb3-a728-e8870a2a58a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.657 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:23:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:35.658 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.666 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:35 compute-0 podman[218000]: 2026-01-23 09:23:35.930764675 +0000 UTC m=+0.029143442 container create 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:23:35 compute-0 systemd[1]: Started libpod-conmon-13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e.scope.
Jan 23 09:23:35 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:23:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e46e497f6e3d5067a805457d93e1eeeef7254a895bd1cc766dc3319b7abaaffe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:23:35 compute-0 podman[218000]: 2026-01-23 09:23:35.988145028 +0000 UTC m=+0.086523784 container init 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:23:35 compute-0 podman[218000]: 2026-01-23 09:23:35.992088005 +0000 UTC m=+0.090466762 container start 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:23:35 compute-0 podman[218000]: 2026-01-23 09:23:35.917570794 +0000 UTC m=+0.015949571 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.993 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:35 compute-0 nova_compute[182092]: 2026-01-23 09:23:35.994 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:36 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [NOTICE]   (218016) : New worker (218018) forked
Jan 23 09:23:36 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [NOTICE]   (218016) : Loading success.
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.007 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.082 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.082 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.087 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.087 182096 INFO nova.compute.claims [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.171 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 74eaa05c-e365-4879-af9a-1bf1c102eda7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.171 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160216.1712363, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.172 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Resumed (Lifecycle Event)
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.173 182096 DEBUG nova.compute.manager [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.176 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance rebooted successfully.
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.176 182096 DEBUG nova.compute.manager [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.178 182096 DEBUG nova.compute.provider_tree [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.194 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.196 182096 DEBUG nova.scheduler.client.report [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.199 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.222 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.223 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.225 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.225 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160216.17285, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.225 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Started (Lifecycle Event)
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.255 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.258 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.275 182096 DEBUG oslo_concurrency.lockutils [None req-37f6789d-2f8a-4277-837d-b58f345da0d8 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.287 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.288 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.300 182096 INFO nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.310 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.391 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.392 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.392 182096 INFO nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating image(s)
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.393 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.393 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.394 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.405 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.462 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.463 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.463 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.473 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.498 182096 DEBUG nova.policy [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5edc8d287a9f4ffd90f54ecea19df7e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.535 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.536 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.568 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.569 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.570 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.627 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.628 182096 DEBUG nova.virt.disk.api [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Checking if we can resize image /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.629 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.685 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.686 182096 DEBUG nova.virt.disk.api [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Cannot resize image /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.686 182096 DEBUG nova.objects.instance [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'migration_context' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.697 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.698 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Ensure instance console log exists: /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.698 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.698 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:36 compute-0 nova_compute[182092]: 2026-01-23 09:23:36.699 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.023 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.053 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.053 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.054 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.054 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.054 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.054 182096 WARNING nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.055 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.055 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.055 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.055 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.056 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.056 182096 WARNING nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.056 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.056 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.057 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.057 182096 DEBUG oslo_concurrency.lockutils [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.057 182096 DEBUG nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.057 182096 WARNING nova.compute.manager [req-ad4fd5d7-5909-44d0-92c5-aa7876cacc4a req-8921f0cc-a051-436b-ae1b-6d4cf5b26bb3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.111 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Successfully created port: d5398983-1fbc-4441-9ed2-b93902eae444 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.464 182096 INFO nova.compute.manager [None req-34f0714e-4874-4da8-838b-c389eebd2ec3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Get console output
Jan 23 09:23:37 compute-0 ovn_controller[94697]: 2026-01-23T09:23:37Z|00230|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.658 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.827 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Successfully updated port: d5398983-1fbc-4441-9ed2-b93902eae444 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.837 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.837 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquired lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:37 compute-0 nova_compute[182092]: 2026-01-23 09:23:37.837 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:23:38 compute-0 nova_compute[182092]: 2026-01-23 09:23:38.077 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:23:38 compute-0 podman[218047]: 2026-01-23 09:23:38.205215404 +0000 UTC m=+0.043033806 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 09:23:38 compute-0 podman[218048]: 2026-01-23 09:23:38.215437118 +0000 UTC m=+0.051842866 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:23:38 compute-0 nova_compute[182092]: 2026-01-23 09:23:38.915 182096 DEBUG nova.compute.manager [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-changed-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:38 compute-0 nova_compute[182092]: 2026-01-23 09:23:38.916 182096 DEBUG nova.compute.manager [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Refreshing instance network info cache due to event network-changed-d5398983-1fbc-4441-9ed2-b93902eae444. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:23:38 compute-0 nova_compute[182092]: 2026-01-23 09:23:38.916 182096 DEBUG oslo_concurrency.lockutils [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.165 182096 DEBUG nova.network.neutron [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Updating instance_info_cache with network_info: [{"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.183 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Releasing lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.184 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance network_info: |[{"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.185 182096 DEBUG oslo_concurrency.lockutils [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.185 182096 DEBUG nova.network.neutron [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Refreshing network info cache for port d5398983-1fbc-4441-9ed2-b93902eae444 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.188 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start _get_guest_xml network_info=[{"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.193 182096 WARNING nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.198 182096 DEBUG nova.virt.libvirt.host [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.198 182096 DEBUG nova.virt.libvirt.host [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.201 182096 DEBUG nova.virt.libvirt.host [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.202 182096 DEBUG nova.virt.libvirt.host [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.203 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.203 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.204 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.205 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.205 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.206 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.206 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.206 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.207 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.208 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.208 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.209 182096 DEBUG nova.virt.hardware [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.212 182096 DEBUG nova.virt.libvirt.vif [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:36Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.213 182096 DEBUG nova.network.os_vif_util [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.214 182096 DEBUG nova.network.os_vif_util [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.215 182096 DEBUG nova.objects.instance [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_devices' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.233 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <uuid>e3ec110a-6bce-48f3-bfa4-2da541b334a3</uuid>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <name>instance-0000004e</name>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-814236726</nova:name>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:23:39</nova:creationTime>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:user uuid="5edc8d287a9f4ffd90f54ecea19df7e8">tempest-ServerDiskConfigTestJSON-1935371313-project-member</nova:user>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:project uuid="b9afff815c4546ad97f6d3afa2c35483">tempest-ServerDiskConfigTestJSON-1935371313</nova:project>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         <nova:port uuid="d5398983-1fbc-4441-9ed2-b93902eae444">
Jan 23 09:23:39 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="serial">e3ec110a-6bce-48f3-bfa4-2da541b334a3</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="uuid">e3ec110a-6bce-48f3-bfa4-2da541b334a3</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:56:04:a2"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <target dev="tapd5398983-1f"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/console.log" append="off"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:23:39 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:23:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:23:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:23:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:23:39 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.238 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Preparing to wait for external event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.239 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.239 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.240 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.240 182096 DEBUG nova.virt.libvirt.vif [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:36Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.241 182096 DEBUG nova.network.os_vif_util [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.242 182096 DEBUG nova.network.os_vif_util [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.243 182096 DEBUG os_vif [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.243 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.243 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.244 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.248 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.249 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5398983-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.249 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd5398983-1f, col_values=(('external_ids', {'iface-id': 'd5398983-1fbc-4441-9ed2-b93902eae444', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:04:a2', 'vm-uuid': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.251 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:39 compute-0 NetworkManager[54920]: <info>  [1769160219.2519] manager: (tapd5398983-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.253 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.256 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.257 182096 INFO os_vif [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f')
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.295 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.296 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.296 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No VIF found with MAC fa:16:3e:56:04:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:23:39 compute-0 nova_compute[182092]: 2026-01-23 09:23:39.296 182096 INFO nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Using config drive
Jan 23 09:23:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:39.858 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:39.859 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:39.859 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.007 182096 INFO nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating config drive at /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.011 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy0wj55pk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.130 182096 DEBUG oslo_concurrency.processutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy0wj55pk" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:40 compute-0 kernel: tapd5398983-1f: entered promiscuous mode
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.1714] manager: (tapd5398983-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.173 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_controller[94697]: 2026-01-23T09:23:40Z|00231|binding|INFO|Claiming lport d5398983-1fbc-4441-9ed2-b93902eae444 for this chassis.
Jan 23 09:23:40 compute-0 ovn_controller[94697]: 2026-01-23T09:23:40Z|00232|binding|INFO|d5398983-1fbc-4441-9ed2-b93902eae444: Claiming fa:16:3e:56:04:a2 10.100.0.13
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.185 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:04:a2 10.100.0.13'], port_security=['fa:16:3e:56:04:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=d5398983-1fbc-4441-9ed2-b93902eae444) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.185 103978 INFO neutron.agent.ovn.metadata.agent [-] Port d5398983-1fbc-4441-9ed2-b93902eae444 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a bound to our chassis
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.188 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.188 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_controller[94697]: 2026-01-23T09:23:40Z|00233|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 ovn-installed in OVS
Jan 23 09:23:40 compute-0 ovn_controller[94697]: 2026-01-23T09:23:40Z|00234|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 up in Southbound
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.192 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f9bd51a6-6f05-4825-b95e-78f7d0caabad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.203 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3598732e-71 in ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:23:40 compute-0 systemd-udevd[218105]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.205 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3598732e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.205 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5239bab5-f7d8-4723-af40-9b495f228907]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 systemd-machined[153562]: New machine qemu-36-instance-0000004e.
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.208 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6618a4f0-195f-481b-a16d-9bc455d1fc2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-0000004e.
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.2179] device (tapd5398983-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.2183] device (tapd5398983-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.217 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[d67bc9b4-c189-44d4-9215-8731b82fb3d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.228 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12c104e4-0231-45a9-8f41-cc7744031f51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.252 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[995c2113-8bd6-47d9-ac56-4230eb500042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.2561] manager: (tap3598732e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.255 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[93001d7a-3b76-4bc7-ba56-52fb5843a727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.293 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[84306911-dc0a-4416-9fef-3baa31bb7ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.295 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[62e51f86-1178-412d-b745-1e189c56e53d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 podman[218110]: 2026-01-23 09:23:40.310417425 +0000 UTC m=+0.069149996 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=openstack_network_exporter)
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.3179] device (tap3598732e-70): carrier: link connected
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.321 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[88749b72-7180-4f51-b759-7f416dcaca29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.334 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a0132bec-8fbc-4b0e-96ee-5b941887909d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372580, 'reachable_time': 43856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218148, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.349 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed93759-f87e-4bb8-be71-18d13a738844]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372580, 'tstamp': 372580}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218149, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.361 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12e0f3bf-fb6f-4383-9575-73808d589698]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372580, 'reachable_time': 43856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218150, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.386 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[953a50e1-3961-4777-8f13-4cb5ffe948c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.430 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[564fc06f-c3f8-4b74-abd9-fbf26a4969b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.431 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.432 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.432 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3598732e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:40 compute-0 NetworkManager[54920]: <info>  [1769160220.4341] manager: (tap3598732e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 23 09:23:40 compute-0 kernel: tap3598732e-70: entered promiscuous mode
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.438 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.441 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3598732e-70, col_values=(('external_ids', {'iface-id': '9bf071ba-d027-4af7-a154-40b491b7a535'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:40 compute-0 ovn_controller[94697]: 2026-01-23T09:23:40Z|00235|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.443 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.446 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.454 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[633fbf3c-3627-432d-908e-de5d03356346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:40 compute-0 nova_compute[182092]: 2026-01-23 09:23:40.456 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.457 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:23:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:40.458 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'env', 'PROCESS_TAG=haproxy-3598732e-75d5-4a2b-8884-521ea92eab7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3598732e-75d5-4a2b-8884-521ea92eab7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:23:40 compute-0 podman[218180]: 2026-01-23 09:23:40.747262021 +0000 UTC m=+0.031335427 container create 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:23:40 compute-0 systemd[1]: Started libpod-conmon-2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6.scope.
Jan 23 09:23:40 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:23:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696942b16336f446213986095a3929a55ac4976ebe70eb25e1deae83abe2c10b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:23:40 compute-0 podman[218180]: 2026-01-23 09:23:40.8068726 +0000 UTC m=+0.090946016 container init 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:23:40 compute-0 podman[218180]: 2026-01-23 09:23:40.811873465 +0000 UTC m=+0.095946881 container start 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:23:40 compute-0 podman[218180]: 2026-01-23 09:23:40.731601046 +0000 UTC m=+0.015674471 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:23:40 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [NOTICE]   (218196) : New worker (218198) forked
Jan 23 09:23:40 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [NOTICE]   (218196) : Loading success.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.190 182096 DEBUG nova.network.neutron [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Updated VIF entry in instance network info cache for port d5398983-1fbc-4441-9ed2-b93902eae444. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.191 182096 DEBUG nova.network.neutron [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Updating instance_info_cache with network_info: [{"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.214 182096 DEBUG oslo_concurrency.lockutils [req-bf55c2e4-1834-4805-9047-e23f8dd88848 req-4c17423d-d7a5-4fc4-ba0e-d90d3e1ebed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-e3ec110a-6bce-48f3-bfa4-2da541b334a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.255 182096 DEBUG nova.compute.manager [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.256 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.256 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.256 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.256 182096 DEBUG nova.compute.manager [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Processing event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.256 182096 DEBUG nova.compute.manager [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.257 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.257 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.257 182096 DEBUG oslo_concurrency.lockutils [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.257 182096 DEBUG nova.compute.manager [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.258 182096 WARNING nova.compute.manager [req-fd13eaf0-59cf-4f2f-9507-ba738e02e5a9 req-5d439c6b-3aab-4dd3-b2c7-11de0a76772d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received unexpected event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with vm_state building and task_state spawning.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.259 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.259 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160221.2593899, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.260 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Started (Lifecycle Event)
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.262 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.265 182096 INFO nova.virt.libvirt.driver [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance spawned successfully.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.265 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.275 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.279 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.281 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.281 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.281 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.282 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.282 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.282 182096 DEBUG nova.virt.libvirt.driver [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.300 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.300 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160221.2621326, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.300 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Paused (Lifecycle Event)
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.322 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.324 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160221.2629128, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.324 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Resumed (Lifecycle Event)
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.346 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.348 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.361 182096 INFO nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Took 4.97 seconds to spawn the instance on the hypervisor.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.362 182096 DEBUG nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.366 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.415 182096 INFO nova.compute.manager [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Took 5.36 seconds to build instance.
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.425 182096 DEBUG oslo_concurrency.lockutils [None req-7d284e72-22cc-41d1-aa46-06e1580f4dff 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.709 182096 DEBUG oslo_concurrency.lockutils [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.710 182096 DEBUG oslo_concurrency.lockutils [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.710 182096 DEBUG nova.compute.manager [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.713 182096 DEBUG nova.compute.manager [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.714 182096 DEBUG nova.objects.instance [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.736 182096 DEBUG nova.objects.instance [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:41 compute-0 nova_compute[182092]: 2026-01-23 09:23:41.764 182096 DEBUG nova.virt.libvirt.driver [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:23:42 compute-0 nova_compute[182092]: 2026-01-23 09:23:42.024 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:44 compute-0 nova_compute[182092]: 2026-01-23 09:23:44.253 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.189 182096 INFO nova.compute.manager [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Rebuilding instance
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.416 182096 DEBUG nova.compute.manager [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.482 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_requests' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.501 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_devices' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.528 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'resources' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.559 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'migration_context' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.589 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:23:45 compute-0 nova_compute[182092]: 2026-01-23 09:23:45.593 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:23:47 compute-0 nova_compute[182092]: 2026-01-23 09:23:47.026 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:48 compute-0 nova_compute[182092]: 2026-01-23 09:23:48.288 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:48 compute-0 ovn_controller[94697]: 2026-01-23T09:23:48Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:23:49 compute-0 podman[218217]: 2026-01-23 09:23:49.229895731 +0000 UTC m=+0.065482128 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:23:49 compute-0 nova_compute[182092]: 2026-01-23 09:23:49.253 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:51 compute-0 ovn_controller[94697]: 2026-01-23T09:23:51Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:04:a2 10.100.0.13
Jan 23 09:23:51 compute-0 ovn_controller[94697]: 2026-01-23T09:23:51Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:04:a2 10.100.0.13
Jan 23 09:23:51 compute-0 nova_compute[182092]: 2026-01-23 09:23:51.734 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:51 compute-0 nova_compute[182092]: 2026-01-23 09:23:51.794 182096 DEBUG nova.virt.libvirt.driver [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:23:52 compute-0 nova_compute[182092]: 2026-01-23 09:23:52.027 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:53 compute-0 kernel: tapb1eb1ee2-fc (unregistering): left promiscuous mode
Jan 23 09:23:53 compute-0 NetworkManager[54920]: <info>  [1769160233.9254] device (tapb1eb1ee2-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:23:53 compute-0 nova_compute[182092]: 2026-01-23 09:23:53.929 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:53 compute-0 ovn_controller[94697]: 2026-01-23T09:23:53Z|00236|binding|INFO|Releasing lport b1eb1ee2-fc97-423a-bac5-6219bd097839 from this chassis (sb_readonly=0)
Jan 23 09:23:53 compute-0 ovn_controller[94697]: 2026-01-23T09:23:53Z|00237|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 down in Southbound
Jan 23 09:23:53 compute-0 ovn_controller[94697]: 2026-01-23T09:23:53Z|00238|binding|INFO|Removing iface tapb1eb1ee2-fc ovn-installed in OVS
Jan 23 09:23:53 compute-0 nova_compute[182092]: 2026-01-23 09:23:53.931 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:53 compute-0 nova_compute[182092]: 2026-01-23 09:23:53.941 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:53.944 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:53.945 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:23:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:53.946 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:23:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:53.947 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[505cf91a-c790-425d-a257-069763d30959]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:53.949 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:23:53 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 23 09:23:53 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Consumed 11.766s CPU time.
Jan 23 09:23:53 compute-0 systemd-machined[153562]: Machine qemu-35-instance-0000004b terminated.
Jan 23 09:23:54 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [NOTICE]   (218016) : haproxy version is 2.8.14-c23fe91
Jan 23 09:23:54 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [NOTICE]   (218016) : path to executable is /usr/sbin/haproxy
Jan 23 09:23:54 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [WARNING]  (218016) : Exiting Master process...
Jan 23 09:23:54 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [ALERT]    (218016) : Current worker (218018) exited with code 143 (Terminated)
Jan 23 09:23:54 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218012]: [WARNING]  (218016) : All workers exited. Exiting... (0)
Jan 23 09:23:54 compute-0 systemd[1]: libpod-13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e.scope: Deactivated successfully.
Jan 23 09:23:54 compute-0 conmon[218012]: conmon 13dcfaa31317c3ea1e8e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e.scope/container/memory.events
Jan 23 09:23:54 compute-0 podman[218277]: 2026-01-23 09:23:54.045688045 +0000 UTC m=+0.031924499 container died 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:23:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e-userdata-shm.mount: Deactivated successfully.
Jan 23 09:23:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-e46e497f6e3d5067a805457d93e1eeeef7254a895bd1cc766dc3319b7abaaffe-merged.mount: Deactivated successfully.
Jan 23 09:23:54 compute-0 podman[218277]: 2026-01-23 09:23:54.066512282 +0000 UTC m=+0.052748735 container cleanup 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:23:54 compute-0 systemd[1]: libpod-conmon-13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e.scope: Deactivated successfully.
Jan 23 09:23:54 compute-0 podman[218301]: 2026-01-23 09:23:54.107714482 +0000 UTC m=+0.026112975 container remove 13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.110 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[698ea6f9-1285-475b-ba4e-94aa654d7417]: (4, ('Fri Jan 23 09:23:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e)\n13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e\nFri Jan 23 09:23:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e)\n13dcfaa31317c3ea1e8e7bcce9a77a4d31e5184a277710893e5741e5b535cd7e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.111 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ca47a5-75f9-4177-b9ab-194b9201d0b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.112 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.114 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:54 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.130 182096 DEBUG nova.compute.manager [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.130 182096 DEBUG oslo_concurrency.lockutils [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.131 182096 DEBUG oslo_concurrency.lockutils [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.131 182096 DEBUG oslo_concurrency.lockutils [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.131 182096 DEBUG nova.compute.manager [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.132 182096 WARNING nova.compute.manager [req-cc4540d4-e610-4e5b-bcfd-242f1ad39518 req-4f7c075b-0c8d-4f42-bf2d-494f98c10c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state powering-off.
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.132 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac3b477-20cf-4e3c-bdca-45ac473f0c98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.132 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.140 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4b2a5c-9bff-4bd7-8afd-07cf5c890e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.141 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[da899c51-4b8a-4cc6-95b1-dd509cd46a7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.152 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6b444d-6e29-4f63-95ef-4ab88925be87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372093, 'reachable_time': 25444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218316, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.155 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:23:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:54.155 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f416c1c2-b7ed-485a-ba01-b6b1d6a50f4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.254 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.801 182096 INFO nova.virt.libvirt.driver [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance shutdown successfully after 13 seconds.
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.805 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance destroyed successfully.
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.805 182096 DEBUG nova.objects.instance [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.815 182096 DEBUG nova.compute.manager [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:23:54 compute-0 nova_compute[182092]: 2026-01-23 09:23:54.871 182096 DEBUG oslo_concurrency.lockutils [None req-599ee9b6-5878-45e2-bee1-9e486db05dcc 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:55 compute-0 nova_compute[182092]: 2026-01-23 09:23:55.629 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.003 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.026 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.056 182096 DEBUG oslo_concurrency.lockutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.057 182096 DEBUG oslo_concurrency.lockutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.057 182096 DEBUG nova.network.neutron [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.212 182096 DEBUG nova.compute.manager [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.213 182096 DEBUG oslo_concurrency.lockutils [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.213 182096 DEBUG oslo_concurrency.lockutils [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.213 182096 DEBUG oslo_concurrency.lockutils [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.213 182096 DEBUG nova.compute.manager [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:56 compute-0 nova_compute[182092]: 2026-01-23 09:23:56.213 182096 WARNING nova.compute.manager [req-e03c4003-d44b-4483-9f21-46c040f20922 req-33931e69-d204-402a-9257-b8f78595fa1d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state stopped and task_state powering-on.
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.031 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 podman[218334]: 2026-01-23 09:23:57.21042672 +0000 UTC m=+0.046687491 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 23 09:23:57 compute-0 podman[218335]: 2026-01-23 09:23:57.215689019 +0000 UTC m=+0.047856816 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.660 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:23:57 compute-0 kernel: tapd5398983-1f (unregistering): left promiscuous mode
Jan 23 09:23:57 compute-0 NetworkManager[54920]: <info>  [1769160237.7506] device (tapd5398983-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.753 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 ovn_controller[94697]: 2026-01-23T09:23:57Z|00239|binding|INFO|Releasing lport d5398983-1fbc-4441-9ed2-b93902eae444 from this chassis (sb_readonly=0)
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.758 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 ovn_controller[94697]: 2026-01-23T09:23:57Z|00240|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 down in Southbound
Jan 23 09:23:57 compute-0 ovn_controller[94697]: 2026-01-23T09:23:57Z|00241|binding|INFO|Removing iface tapd5398983-1f ovn-installed in OVS
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.766 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:04:a2 10.100.0.13'], port_security=['fa:16:3e:56:04:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=d5398983-1fbc-4441-9ed2-b93902eae444) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.767 103978 INFO neutron.agent.ovn.metadata.agent [-] Port d5398983-1fbc-4441-9ed2-b93902eae444 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a unbound from our chassis
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.768 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3598732e-75d5-4a2b-8884-521ea92eab7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.769 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b736c55-2e63-46b1-9304-0d3a733022d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.769 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace which is not needed anymore
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.773 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 23 09:23:57 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004e.scope: Consumed 11.185s CPU time.
Jan 23 09:23:57 compute-0 systemd-machined[153562]: Machine qemu-36-instance-0000004e terminated.
Jan 23 09:23:57 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [NOTICE]   (218196) : haproxy version is 2.8.14-c23fe91
Jan 23 09:23:57 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [NOTICE]   (218196) : path to executable is /usr/sbin/haproxy
Jan 23 09:23:57 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [ALERT]    (218196) : Current worker (218198) exited with code 143 (Terminated)
Jan 23 09:23:57 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218192]: [WARNING]  (218196) : All workers exited. Exiting... (0)
Jan 23 09:23:57 compute-0 systemd[1]: libpod-2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6.scope: Deactivated successfully.
Jan 23 09:23:57 compute-0 podman[218394]: 2026-01-23 09:23:57.877239927 +0000 UTC m=+0.036915895 container died 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6-userdata-shm.mount: Deactivated successfully.
Jan 23 09:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-696942b16336f446213986095a3929a55ac4976ebe70eb25e1deae83abe2c10b-merged.mount: Deactivated successfully.
Jan 23 09:23:57 compute-0 podman[218394]: 2026-01-23 09:23:57.898466923 +0000 UTC m=+0.058142892 container cleanup 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:23:57 compute-0 systemd[1]: libpod-conmon-2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6.scope: Deactivated successfully.
Jan 23 09:23:57 compute-0 podman[218417]: 2026-01-23 09:23:57.943430119 +0000 UTC m=+0.026529751 container remove 2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.947 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ac43c6ce-4611-436f-9ffa-e388211ab5a2]: (4, ('Fri Jan 23 09:23:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6)\n2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6\nFri Jan 23 09:23:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6)\n2ed3d78f9803feeac20edbec87cfb39b45eab5e9f21428338bf0f84656e331d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.949 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[22248eb7-f5d5-4dd9-baf5-101cf5205007]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.949 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.951 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 kernel: tap3598732e-70: left promiscuous mode
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.964 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 NetworkManager[54920]: <info>  [1769160237.9666] manager: (tapd5398983-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Jan 23 09:23:57 compute-0 nova_compute[182092]: 2026-01-23 09:23:57.971 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.973 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5dbdc9-7c41-4189-8a36-533ef49b7edd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.983 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c89d2ac7-e422-4652-b88d-f807c4f49c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.984 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b66622-1f3b-476f-823f-fde2ca5c1b1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:57.998 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7aa46d-994a-476a-9ca8-21844e2ed00e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372573, 'reachable_time': 36981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218447, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d3598732e\x2d75d5\x2d4a2b\x2d8884\x2d521ea92eab7a.mount: Deactivated successfully.
Jan 23 09:23:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:58.000 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:23:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:23:58.000 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[67333dd9-dd61-4788-a1ad-972596dc9bf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.284 182096 DEBUG nova.compute.manager [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.285 182096 DEBUG oslo_concurrency.lockutils [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.285 182096 DEBUG oslo_concurrency.lockutils [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.285 182096 DEBUG oslo_concurrency.lockutils [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.285 182096 DEBUG nova.compute.manager [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.286 182096 WARNING nova.compute.manager [req-9d50c5af-3c3d-4212-80cb-0699beec03c7 req-641f816d-643a-4aee-a164-2db22d31f140 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received unexpected event network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with vm_state active and task_state rebuilding.
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.640 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance shutdown successfully after 13 seconds.
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.644 182096 INFO nova.virt.libvirt.driver [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance destroyed successfully.
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.647 182096 INFO nova.virt.libvirt.driver [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance destroyed successfully.
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.648 182096 DEBUG nova.virt.libvirt.vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-membe
r'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:44Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.648 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.649 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.649 182096 DEBUG os_vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.650 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.650 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5398983-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.651 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.653 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.655 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.657 182096 INFO os_vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f')
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.657 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Deleting instance files /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3_del
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.658 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Deletion of /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3_del complete
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.829 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.829 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating image(s)
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.829 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.830 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.830 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.841 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.888 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.888 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.889 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.900 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.945 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.946 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.967 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.968 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:58 compute-0 nova_compute[182092]: 2026-01-23 09:23:58.968 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.013 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.013 182096 DEBUG nova.virt.disk.api [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Checking if we can resize image /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.014 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.059 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.060 182096 DEBUG nova.virt.disk.api [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Cannot resize image /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.061 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.061 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Ensure instance console log exists: /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.061 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.062 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.062 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.064 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start _get_guest_xml network_info=[{"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.069 182096 WARNING nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.075 182096 DEBUG nova.virt.libvirt.host [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.075 182096 DEBUG nova.virt.libvirt.host [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.077 182096 DEBUG nova.virt.libvirt.host [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.078 182096 DEBUG nova.virt.libvirt.host [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.079 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.079 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.079 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.080 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.080 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.080 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.080 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.080 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.081 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.081 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.081 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.081 182096 DEBUG nova.virt.hardware [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.081 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.105 182096 DEBUG nova.virt.libvirt.vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-Ser
verDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:58Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.106 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.106 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.107 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <uuid>e3ec110a-6bce-48f3-bfa4-2da541b334a3</uuid>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <name>instance-0000004e</name>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-814236726</nova:name>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:23:59</nova:creationTime>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:user uuid="5edc8d287a9f4ffd90f54ecea19df7e8">tempest-ServerDiskConfigTestJSON-1935371313-project-member</nova:user>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:project uuid="b9afff815c4546ad97f6d3afa2c35483">tempest-ServerDiskConfigTestJSON-1935371313</nova:project>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="11cc348c-4b05-42ba-a4b9-513b91dede76"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         <nova:port uuid="d5398983-1fbc-4441-9ed2-b93902eae444">
Jan 23 09:23:59 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <system>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="serial">e3ec110a-6bce-48f3-bfa4-2da541b334a3</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="uuid">e3ec110a-6bce-48f3-bfa4-2da541b334a3</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </system>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <os>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </os>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <features>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </features>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:56:04:a2"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <target dev="tapd5398983-1f"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/console.log" append="off"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <video>
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </video>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:23:59 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:23:59 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:23:59 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:23:59 compute-0 nova_compute[182092]: </domain>
Jan 23 09:23:59 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.109 182096 DEBUG nova.compute.manager [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Preparing to wait for external event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.109 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.109 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.109 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.110 182096 DEBUG nova.virt.libvirt.vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:58Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.110 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.111 182096 DEBUG nova.network.os_vif_util [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.111 182096 DEBUG os_vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.112 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.112 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.112 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.114 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.115 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5398983-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.115 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd5398983-1f, col_values=(('external_ids', {'iface-id': 'd5398983-1fbc-4441-9ed2-b93902eae444', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:04:a2', 'vm-uuid': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.116 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:59 compute-0 NetworkManager[54920]: <info>  [1769160239.1173] manager: (tapd5398983-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.119 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.121 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.122 182096 INFO os_vif [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f')
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.158 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.158 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.158 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No VIF found with MAC fa:16:3e:56:04:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.159 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Using config drive
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.173 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'ec2_ids' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:23:59 compute-0 nova_compute[182092]: 2026-01-23 09:23:59.208 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'keypairs' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.173 182096 DEBUG nova.network.neutron [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.190 182096 DEBUG oslo_concurrency.lockutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.209 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance destroyed successfully.
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.210 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.218 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.229 182096 DEBUG nova.virt.libvirt.vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:23:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.229 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.229 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.230 182096 DEBUG os_vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.231 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1eb1ee2-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.236 182096 INFO os_vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.241 182096 DEBUG nova.virt.libvirt.driver [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start _get_guest_xml network_info=[{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.243 182096 WARNING nova.virt.libvirt.driver [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.247 182096 DEBUG nova.virt.libvirt.host [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.247 182096 DEBUG nova.virt.libvirt.host [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.250 182096 DEBUG nova.virt.libvirt.host [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.250 182096 DEBUG nova.virt.libvirt.host [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.251 182096 DEBUG nova.virt.libvirt.driver [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.251 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.251 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.251 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.252 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.252 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.252 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.252 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.252 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.253 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.253 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.253 182096 DEBUG nova.virt.hardware [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.253 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.266 182096 DEBUG nova.virt.libvirt.vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:23:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.266 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.267 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.268 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.277 182096 DEBUG nova.virt.libvirt.driver [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <uuid>74eaa05c-e365-4879-af9a-1bf1c102eda7</uuid>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <name>instance-0000004b</name>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-1490124151</nova:name>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:24:00</nova:creationTime>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         <nova:port uuid="b1eb1ee2-fc97-423a-bac5-6219bd097839">
Jan 23 09:24:00 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <system>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="serial">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="uuid">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </system>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <os>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </os>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <features>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </features>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:27:52:52"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <target dev="tapb1eb1ee2-fc"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/console.log" append="off"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <video>
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </video>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:24:00 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:24:00 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:24:00 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:24:00 compute-0 nova_compute[182092]: </domain>
Jan 23 09:24:00 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.278 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.326 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.327 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.374 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.375 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.391 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.438 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.439 182096 DEBUG nova.virt.disk.api [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.440 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.497 182096 DEBUG oslo_concurrency.processutils [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.497 182096 DEBUG nova.virt.disk.api [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.498 182096 DEBUG nova.objects.instance [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.510 182096 DEBUG nova.virt.libvirt.vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:23:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.510 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.511 182096 DEBUG nova.network.os_vif_util [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.511 182096 DEBUG os_vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.512 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.512 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.513 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.515 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.515 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1eb1ee2-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.516 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1eb1ee2-fc, col_values=(('external_ids', {'iface-id': 'b1eb1ee2-fc97-423a-bac5-6219bd097839', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:52:52', 'vm-uuid': '74eaa05c-e365-4879-af9a-1bf1c102eda7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.517 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.5179] manager: (tapb1eb1ee2-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.519 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.522 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.522 182096 INFO os_vif [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:24:00 compute-0 kernel: tapb1eb1ee2-fc: entered promiscuous mode
Jan 23 09:24:00 compute-0 systemd-udevd[218376]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.5685] manager: (tapb1eb1ee2-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Jan 23 09:24:00 compute-0 ovn_controller[94697]: 2026-01-23T09:24:00Z|00242|binding|INFO|Claiming lport b1eb1ee2-fc97-423a-bac5-6219bd097839 for this chassis.
Jan 23 09:24:00 compute-0 ovn_controller[94697]: 2026-01-23T09:24:00Z|00243|binding|INFO|b1eb1ee2-fc97-423a-bac5-6219bd097839: Claiming fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.571 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.572 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.5815] device (tapb1eb1ee2-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.5822] device (tapb1eb1ee2-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.580 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.581 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.582 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:00 compute-0 ovn_controller[94697]: 2026-01-23T09:24:00Z|00244|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 up in Southbound
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.588 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 ovn_controller[94697]: 2026-01-23T09:24:00Z|00245|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 ovn-installed in OVS
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.589 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.591 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4243fb11-0e11-45dd-9f02-82a130d8f6a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.592 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.593 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.593 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3880a25-a79a-4ca6-b455-c7b2980d7ce0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.594 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[abe6a0eb-f318-4295-b696-8b07d184132b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.604 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[18ec2abe-8e3e-40ad-9746-fff2219951db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 systemd-machined[153562]: New machine qemu-37-instance-0000004b.
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.614 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[29ca4b8c-cb30-43f2-84ae-1120fe7bbafa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-0000004b.
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.635 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[28fb742c-98ad-4a62-867d-e18518d3a2b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.638 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f31d6129-b1ef-47fa-a88a-961bd5513153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.6400] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.668 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[467ff1f2-d791-4b7a-bc81-ebc12c478894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.670 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0075d0df-a35b-4718-9b9f-c20be2825ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.672 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.672 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.6948] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.700 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7f3527-18d1-43e2-b9e2-3d1e3e22bf1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.715 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b232a183-4b3b-445b-8819-80b762e72d32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374618, 'reachable_time': 38825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218529, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.719 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.737 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[951824cf-bec2-4720-9c1b-0f77bc25219a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374618, 'tstamp': 374618}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218531, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[51153d06-3fa4-4ef7-99cf-d72d443371e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374618, 'reachable_time': 38825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218532, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.770 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.770 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.770 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[42770711-95fd-4e0e-85a6-f8a8919dab10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.809 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[37e2d07f-c116-425f-806c-6bea188cfe05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.810 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.810 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.811 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 NetworkManager[54920]: <info>  [1769160240.8130] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.812 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.818 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.820 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.821 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 ovn_controller[94697]: 2026-01-23T09:24:00Z|00246|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.835 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.839 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.839 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b70cca96-19df-44e2-bc2c-5b232a5ae845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.840 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:24:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:00.840 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.839 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.842 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.891 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.891 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.947 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:00 compute-0 nova_compute[182092]: 2026-01-23 09:24:00.948 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000004e, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config'
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.056 182096 INFO nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Creating config drive at /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.061 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd549y6_0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.123 182096 DEBUG nova.compute.manager [req-f252de93-cfbc-459a-b780-b0a82df314c0 req-330ec624-3478-42d7-b859-bad128223c32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.123 182096 DEBUG oslo_concurrency.lockutils [req-f252de93-cfbc-459a-b780-b0a82df314c0 req-330ec624-3478-42d7-b859-bad128223c32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.124 182096 DEBUG oslo_concurrency.lockutils [req-f252de93-cfbc-459a-b780-b0a82df314c0 req-330ec624-3478-42d7-b859-bad128223c32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.124 182096 DEBUG oslo_concurrency.lockutils [req-f252de93-cfbc-459a-b780-b0a82df314c0 req-330ec624-3478-42d7-b859-bad128223c32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.124 182096 DEBUG nova.compute.manager [req-f252de93-cfbc-459a-b780-b0a82df314c0 req-330ec624-3478-42d7-b859-bad128223c32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Processing event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:24:01 compute-0 podman[218577]: 2026-01-23 09:24:01.161798791 +0000 UTC m=+0.056930023 container create 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:24:01 compute-0 systemd[1]: Started libpod-conmon-10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb.scope.
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.182 182096 DEBUG oslo_concurrency.processutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd549y6_0" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:01 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:24:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5658fe4fc9a674f4b35a59cc06dd54a8eab1c3eb1585586857a862bc11109684/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:24:01 compute-0 podman[218577]: 2026-01-23 09:24:01.217499114 +0000 UTC m=+0.112630336 container init 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:24:01 compute-0 podman[218577]: 2026-01-23 09:24:01.222729792 +0000 UTC m=+0.117861014 container start 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:24:01 compute-0 podman[218577]: 2026-01-23 09:24:01.136802354 +0000 UTC m=+0.031933596 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:24:01 compute-0 kernel: tapd5398983-1f: entered promiscuous mode
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.2341] manager: (tapd5398983-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.234 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 ovn_controller[94697]: 2026-01-23T09:24:01Z|00247|binding|INFO|Claiming lport d5398983-1fbc-4441-9ed2-b93902eae444 for this chassis.
Jan 23 09:24:01 compute-0 ovn_controller[94697]: 2026-01-23T09:24:01Z|00248|binding|INFO|d5398983-1fbc-4441-9ed2-b93902eae444: Claiming fa:16:3e:56:04:a2 10.100.0.13
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 ovn_controller[94697]: 2026-01-23T09:24:01Z|00249|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 ovn-installed in OVS
Jan 23 09:24:01 compute-0 ovn_controller[94697]: 2026-01-23T09:24:01Z|00250|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 up in Southbound
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.259 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [NOTICE]   (218602) : New worker (218605) forked
Jan 23 09:24:01 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [NOTICE]   (218602) : Loading success.
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.262 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:04:a2 10.100.0.13'], port_security=['fa:16:3e:56:04:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=d5398983-1fbc-4441-9ed2-b93902eae444) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:01 compute-0 systemd-udevd[218609]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.2867] device (tapd5398983-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.2872] device (tapd5398983-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:24:01 compute-0 systemd-machined[153562]: New machine qemu-38-instance-0000004e.
Jan 23 09:24:01 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-0000004e.
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.318 103978 INFO neutron.agent.ovn.metadata.agent [-] Port d5398983-1fbc-4441-9ed2-b93902eae444 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a unbound from our chassis
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.320 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.328 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[619d5e70-7d0f-4a39-b502-08007588ee83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.331 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3598732e-71 in ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.333 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3598732e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.333 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[107ab4f8-54a0-44ee-b0d9-f592bbcdec27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.334 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9a37da6a-7f60-4fe7-b44d-77f04caeffc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.338 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.339 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5740MB free_disk=73.23473358154297GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.339 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.339 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.349 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f81d32f8-48d1-4f58-b3ae-8682d9dd8898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.358 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2fc10c-9d2e-452d-baf8-132f33d7e2aa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.378 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2830a5d3-d384-4536-ad43-86bf4fcf33ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.3833] manager: (tap3598732e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.382 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d64b1fc5-6f4a-4cab-9e89-eced27e1d9e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.401 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 74eaa05c-e365-4879-af9a-1bf1c102eda7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.402 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance e3ec110a-6bce-48f3-bfa4-2da541b334a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.402 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.402 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.408 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ba01012b-f55a-450a-8471-42747d5039b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.410 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ca93aae1-7c9b-44ad-94e7-99e8435dc642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.4248] device (tap3598732e-70): carrier: link connected
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.429 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[634077fc-3bb6-40c5-9aec-3fad15a1786e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.442 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.445 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b55138f8-c86b-4113-898f-2dc603e9847d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374691, 'reachable_time': 38789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218638, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.454 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.458 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0c09fa-ce94-4a3c-81ef-3c08e0d6f53b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374691, 'tstamp': 374691}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218639, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.470 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ff7823-85ad-4824-9ffd-7f96ca81995a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374691, 'reachable_time': 38789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218640, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.472 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.473 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.491 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b293fa6e-570c-4a6e-8f34-4be8de41f5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.528 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9db1f5-4e92-4cf0-b642-3a38f003c7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.529 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.529 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.529 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3598732e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.531 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 NetworkManager[54920]: <info>  [1769160241.5316] manager: (tap3598732e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 23 09:24:01 compute-0 kernel: tap3598732e-70: entered promiscuous mode
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.533 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.535 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3598732e-70, col_values=(('external_ids', {'iface-id': '9bf071ba-d027-4af7-a154-40b491b7a535'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.536 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 ovn_controller[94697]: 2026-01-23T09:24:01Z|00251|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.537 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.539 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.539 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[779fbfcd-2b76-48fd-a55c-98705e2b6657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.540 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:24:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:01.541 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'env', 'PROCESS_TAG=haproxy-3598732e-75d5-4a2b-8884-521ea92eab7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3598732e-75d5-4a2b-8884-521ea92eab7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:24:01 compute-0 nova_compute[182092]: 2026-01-23 09:24:01.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:01 compute-0 podman[218669]: 2026-01-23 09:24:01.826688489 +0000 UTC m=+0.036403248 container create dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:24:01 compute-0 systemd[1]: Started libpod-conmon-dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0.scope.
Jan 23 09:24:01 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:24:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe5631593faacd51cacc487aa4757731f4effdc1b49d86fadcc2f08aa020215/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:24:01 compute-0 podman[218669]: 2026-01-23 09:24:01.877047285 +0000 UTC m=+0.086762043 container init dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:24:01 compute-0 podman[218669]: 2026-01-23 09:24:01.881727203 +0000 UTC m=+0.091441962 container start dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:24:01 compute-0 podman[218669]: 2026-01-23 09:24:01.81253126 +0000 UTC m=+0.022246039 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:24:01 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [NOTICE]   (218684) : New worker (218686) forked
Jan 23 09:24:01 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [NOTICE]   (218684) : Loading success.
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.000 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 74eaa05c-e365-4879-af9a-1bf1c102eda7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.000 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160241.9999645, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.001 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Resumed (Lifecycle Event)
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.002 182096 DEBUG nova.compute.manager [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.005 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance rebooted successfully.
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.005 182096 DEBUG nova.compute.manager [None req-47983c68-519b-4da9-adf9-0ae1370b7e8a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.032 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.035 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.039 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.055 182096 DEBUG nova.compute.manager [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.059 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.062 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160242.0024807, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.062 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Started (Lifecycle Event)
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.066 182096 INFO nova.virt.libvirt.driver [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance spawned successfully.
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.066 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.094 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.095 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.095 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.095 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.096 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.096 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.096 182096 DEBUG nova.virt.libvirt.driver [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.100 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.136 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for e3ec110a-6bce-48f3-bfa4-2da541b334a3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.136 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160242.054642, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.136 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Started (Lifecycle Event)
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.161 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.163 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.203 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.204 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160242.0547178, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.204 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Paused (Lifecycle Event)
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.205 182096 DEBUG nova.compute.manager [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.236 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.239 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160242.0577958, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.239 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Resumed (Lifecycle Event)
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.255 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.257 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.280 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.281 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.281 182096 DEBUG nova.objects.instance [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.359 182096 DEBUG oslo_concurrency.lockutils [None req-6cb51f19-c430-4efc-9846-470bd353b287 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.468 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.468 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.468 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:24:02 compute-0 nova_compute[182092]: 2026-01-23 09:24:02.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.028 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.029 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.029 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.029 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.224 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.224 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.224 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.224 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.225 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.225 182096 WARNING nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.225 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.225 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.225 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 WARNING nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.226 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.227 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.227 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.227 182096 WARNING nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received unexpected event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with vm_state active and task_state None.
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.227 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.227 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.228 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.228 182096 DEBUG oslo_concurrency.lockutils [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.228 182096 DEBUG nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:03 compute-0 nova_compute[182092]: 2026-01-23 09:24:03.228 182096 WARNING nova.compute.manager [req-09b6d721-830a-4c3f-9d96-9a1c568dadc7 req-8314aaf8-65c0-447f-9372-cfde0ef1b636 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received unexpected event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with vm_state active and task_state None.
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.303 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.322 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.323 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.323 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.323 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:05 compute-0 nova_compute[182092]: 2026-01-23 09:24:05.518 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.033 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.034 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.034 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.034 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.034 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.041 182096 INFO nova.compute.manager [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Terminating instance
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.046 182096 DEBUG nova.compute.manager [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:24:06 compute-0 kernel: tapd5398983-1f (unregistering): left promiscuous mode
Jan 23 09:24:06 compute-0 NetworkManager[54920]: <info>  [1769160246.0608] device (tapd5398983-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.067 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 ovn_controller[94697]: 2026-01-23T09:24:06Z|00252|binding|INFO|Releasing lport d5398983-1fbc-4441-9ed2-b93902eae444 from this chassis (sb_readonly=0)
Jan 23 09:24:06 compute-0 ovn_controller[94697]: 2026-01-23T09:24:06Z|00253|binding|INFO|Setting lport d5398983-1fbc-4441-9ed2-b93902eae444 down in Southbound
Jan 23 09:24:06 compute-0 ovn_controller[94697]: 2026-01-23T09:24:06Z|00254|binding|INFO|Removing iface tapd5398983-1f ovn-installed in OVS
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.069 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.080 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:04:a2 10.100.0.13'], port_security=['fa:16:3e:56:04:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e3ec110a-6bce-48f3-bfa4-2da541b334a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=d5398983-1fbc-4441-9ed2-b93902eae444) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.081 103978 INFO neutron.agent.ovn.metadata.agent [-] Port d5398983-1fbc-4441-9ed2-b93902eae444 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a unbound from our chassis
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.083 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3598732e-75d5-4a2b-8884-521ea92eab7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.084 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9c06fa-0be0-4689-8656-db5a06a27869]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.085 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace which is not needed anymore
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.088 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 23 09:24:06 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004e.scope: Consumed 4.688s CPU time.
Jan 23 09:24:06 compute-0 systemd-machined[153562]: Machine qemu-38-instance-0000004e terminated.
Jan 23 09:24:06 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [NOTICE]   (218684) : haproxy version is 2.8.14-c23fe91
Jan 23 09:24:06 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [NOTICE]   (218684) : path to executable is /usr/sbin/haproxy
Jan 23 09:24:06 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [WARNING]  (218684) : Exiting Master process...
Jan 23 09:24:06 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [ALERT]    (218684) : Current worker (218686) exited with code 143 (Terminated)
Jan 23 09:24:06 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[218680]: [WARNING]  (218684) : All workers exited. Exiting... (0)
Jan 23 09:24:06 compute-0 systemd[1]: libpod-dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0.scope: Deactivated successfully.
Jan 23 09:24:06 compute-0 podman[218727]: 2026-01-23 09:24:06.18364454 +0000 UTC m=+0.030386277 container died dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:24:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0-userdata-shm.mount: Deactivated successfully.
Jan 23 09:24:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-cbe5631593faacd51cacc487aa4757731f4effdc1b49d86fadcc2f08aa020215-merged.mount: Deactivated successfully.
Jan 23 09:24:06 compute-0 podman[218727]: 2026-01-23 09:24:06.205823551 +0000 UTC m=+0.052565288 container cleanup dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:24:06 compute-0 systemd[1]: libpod-conmon-dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0.scope: Deactivated successfully.
Jan 23 09:24:06 compute-0 podman[218752]: 2026-01-23 09:24:06.249406063 +0000 UTC m=+0.026720922 container remove dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.253 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e63d74-a2d3-4782-8cd0-fab1c1bce5db]: (4, ('Fri Jan 23 09:24:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0)\ndd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0\nFri Jan 23 09:24:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (dd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0)\ndd2e3f855c378f8dc2fcfdadb1a148fca9ef7998727b5f19d43b148fc87189e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.256 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5329ea40-3ecf-4652-bf65-c0648efd3ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.257 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.259 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.265 182096 DEBUG nova.compute.manager [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.265 182096 DEBUG oslo_concurrency.lockutils [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.266 182096 DEBUG oslo_concurrency.lockutils [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.266 182096 DEBUG oslo_concurrency.lockutils [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.266 182096 DEBUG nova.compute.manager [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.267 182096 DEBUG nova.compute.manager [req-e222bd79-b6e7-478b-ab83-c704f2aebf9a req-d843f264-8c50-47a6-9572-ca1104a4f7d0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-unplugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 kernel: tap3598732e-70: left promiscuous mode
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.286 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.288 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5f87f0cf-b57e-43b0-b4a8-c9e9688ac35e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.298 182096 INFO nova.virt.libvirt.driver [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Instance destroyed successfully.
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.298 182096 DEBUG nova.objects.instance [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'resources' on Instance uuid e3ec110a-6bce-48f3-bfa4-2da541b334a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.299 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c304fa31-5874-45d0-85d7-bf0a98c0ed60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.300 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c6703449-6b8b-4192-ad72-148936086fc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.311 182096 DEBUG nova.virt.libvirt.vif [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:23:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-814236726',display_name='tempest-ServerDiskConfigTestJSON-server-814236726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-814236726',id=78,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-9ly5rpza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:24:02Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=e3ec110a-6bce-48f3-bfa4-2da541b334a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.312 182096 DEBUG nova.network.os_vif_util [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "d5398983-1fbc-4441-9ed2-b93902eae444", "address": "fa:16:3e:56:04:a2", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5398983-1f", "ovs_interfaceid": "d5398983-1fbc-4441-9ed2-b93902eae444", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.313 182096 DEBUG nova.network.os_vif_util [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.313 182096 DEBUG os_vif [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.314 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.314 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5398983-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d3598732e\x2d75d5\x2d4a2b\x2d8884\x2d521ea92eab7a.mount: Deactivated successfully.
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.314 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c544b0-329f-4bf6-9bff-2333732d2b6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374686, 'reachable_time': 21223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218783, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.317 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:24:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:06.317 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e8e006-d705-41eb-8a40-3c3a291d2646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.321 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.323 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.324 182096 INFO os_vif [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:04:a2,bridge_name='br-int',has_traffic_filtering=True,id=d5398983-1fbc-4441-9ed2-b93902eae444,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5398983-1f')
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.325 182096 INFO nova.virt.libvirt.driver [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Deleting instance files /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3_del
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.325 182096 INFO nova.virt.libvirt.driver [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Deletion of /var/lib/nova/instances/e3ec110a-6bce-48f3-bfa4-2da541b334a3_del complete
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.376 182096 INFO nova.compute.manager [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.376 182096 DEBUG oslo.service.loopingcall [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.376 182096 DEBUG nova.compute.manager [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.376 182096 DEBUG nova.network.neutron [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:24:06 compute-0 ovn_controller[94697]: 2026-01-23T09:24:06Z|00255|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:24:06 compute-0 nova_compute[182092]: 2026-01-23 09:24:06.579 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.034 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.350 182096 DEBUG nova.network.neutron [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.362 182096 INFO nova.compute.manager [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Took 0.99 seconds to deallocate network for instance.
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.422 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.423 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.490 182096 DEBUG nova.compute.provider_tree [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.507 182096 DEBUG nova.scheduler.client.report [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.526 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.528 182096 DEBUG nova.compute.manager [req-baaaee32-aa5a-4f81-bb0e-49d016b7a376 req-d4bffa81-7146-4535-bb81-eb2ff708b387 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-deleted-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.557 182096 INFO nova.scheduler.client.report [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Deleted allocations for instance e3ec110a-6bce-48f3-bfa4-2da541b334a3
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.619 182096 DEBUG oslo_concurrency.lockutils [None req-8068a35f-1216-4acd-b94c-06b1d6e184c8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.703 182096 INFO nova.compute.manager [None req-5d9fe59d-d27d-4c9b-ac19-37b3ed587a81 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Pausing
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.704 182096 DEBUG nova.objects.instance [None req-5d9fe59d-d27d-4c9b-ac19-37b3ed587a81 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.728 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160247.7286685, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.729 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Paused (Lifecycle Event)
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.730 182096 DEBUG nova.compute.manager [None req-5d9fe59d-d27d-4c9b-ac19-37b3ed587a81 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.745 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.747 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:07 compute-0 nova_compute[182092]: 2026-01-23 09:24:07.771 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.356 182096 DEBUG nova.compute.manager [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.356 182096 DEBUG oslo_concurrency.lockutils [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.357 182096 DEBUG oslo_concurrency.lockutils [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.357 182096 DEBUG oslo_concurrency.lockutils [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "e3ec110a-6bce-48f3-bfa4-2da541b334a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.357 182096 DEBUG nova.compute.manager [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] No waiting events found dispatching network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.357 182096 WARNING nova.compute.manager [req-b0ee4705-3de3-4542-ba6f-703d2ddcfee4 req-16c330a7-3d8d-40aa-aeaf-707c0835ea2a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Received unexpected event network-vif-plugged-d5398983-1fbc-4441-9ed2-b93902eae444 for instance with vm_state deleted and task_state None.
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.504 182096 INFO nova.compute.manager [None req-4b7e5181-f4d1-44ab-a13d-4e7810079a83 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Unpausing
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.505 182096 DEBUG nova.objects.instance [None req-4b7e5181-f4d1-44ab-a13d-4e7810079a83 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.528 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160248.5283484, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.528 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Resumed (Lifecycle Event)
Jan 23 09:24:08 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.530 182096 DEBUG nova.virt.libvirt.guest [None req-4b7e5181-f4d1-44ab-a13d-4e7810079a83 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.531 182096 DEBUG nova.compute.manager [None req-4b7e5181-f4d1-44ab-a13d-4e7810079a83 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.546 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.548 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:08 compute-0 nova_compute[182092]: 2026-01-23 09:24:08.572 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 23 09:24:09 compute-0 podman[218785]: 2026-01-23 09:24:09.239307957 +0000 UTC m=+0.073983796 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:24:09 compute-0 podman[218784]: 2026-01-23 09:24:09.239302297 +0000 UTC m=+0.075361957 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:24:11 compute-0 podman[218821]: 2026-01-23 09:24:11.206177795 +0000 UTC m=+0.047160182 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:24:11 compute-0 nova_compute[182092]: 2026-01-23 09:24:11.318 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:12 compute-0 nova_compute[182092]: 2026-01-23 09:24:12.037 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:13 compute-0 nova_compute[182092]: 2026-01-23 09:24:13.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:14 compute-0 ovn_controller[94697]: 2026-01-23T09:24:14Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:24:16 compute-0 nova_compute[182092]: 2026-01-23 09:24:16.319 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:16 compute-0 nova_compute[182092]: 2026-01-23 09:24:16.742 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:17 compute-0 nova_compute[182092]: 2026-01-23 09:24:17.037 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:20 compute-0 podman[218846]: 2026-01-23 09:24:20.242443679 +0000 UTC m=+0.079449699 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.548 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.548 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.548 182096 INFO nova.compute.manager [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Rebooting instance
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.559 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.559 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:20 compute-0 nova_compute[182092]: 2026-01-23 09:24:20.559 182096 DEBUG nova.network.neutron [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:24:21 compute-0 nova_compute[182092]: 2026-01-23 09:24:21.295 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160246.2951484, e3ec110a-6bce-48f3-bfa4-2da541b334a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:21 compute-0 nova_compute[182092]: 2026-01-23 09:24:21.296 182096 INFO nova.compute.manager [-] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] VM Stopped (Lifecycle Event)
Jan 23 09:24:21 compute-0 nova_compute[182092]: 2026-01-23 09:24:21.310 182096 DEBUG nova.compute.manager [None req-be2cdbd0-73f4-492f-97e2-2f75bc6b9612 - - - - - -] [instance: e3ec110a-6bce-48f3-bfa4-2da541b334a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:21 compute-0 nova_compute[182092]: 2026-01-23 09:24:21.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:22 compute-0 nova_compute[182092]: 2026-01-23 09:24:22.039 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.188 182096 DEBUG nova.network.neutron [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.206 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.215 182096 DEBUG nova.compute.manager [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:23 compute-0 kernel: tapb1eb1ee2-fc (unregistering): left promiscuous mode
Jan 23 09:24:23 compute-0 NetworkManager[54920]: <info>  [1769160263.3606] device (tapb1eb1ee2-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00256|binding|INFO|Releasing lport b1eb1ee2-fc97-423a-bac5-6219bd097839 from this chassis (sb_readonly=0)
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00257|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 down in Southbound
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.364 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00258|binding|INFO|Removing iface tapb1eb1ee2-fc ovn-installed in OVS
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.366 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.377 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.378 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.380 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.381 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fc17de80-bac7-417f-a92d-9121300ab9bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.381 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.382 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 23 09:24:23 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004b.scope: Consumed 12.390s CPU time.
Jan 23 09:24:23 compute-0 systemd-machined[153562]: Machine qemu-37-instance-0000004b terminated.
Jan 23 09:24:23 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [NOTICE]   (218602) : haproxy version is 2.8.14-c23fe91
Jan 23 09:24:23 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [NOTICE]   (218602) : path to executable is /usr/sbin/haproxy
Jan 23 09:24:23 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [WARNING]  (218602) : Exiting Master process...
Jan 23 09:24:23 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [ALERT]    (218602) : Current worker (218605) exited with code 143 (Terminated)
Jan 23 09:24:23 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[218588]: [WARNING]  (218602) : All workers exited. Exiting... (0)
Jan 23 09:24:23 compute-0 systemd[1]: libpod-10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb.scope: Deactivated successfully.
Jan 23 09:24:23 compute-0 podman[218890]: 2026-01-23 09:24:23.485462654 +0000 UTC m=+0.035893265 container died 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 09:24:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb-userdata-shm.mount: Deactivated successfully.
Jan 23 09:24:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-5658fe4fc9a674f4b35a59cc06dd54a8eab1c3eb1585586857a862bc11109684-merged.mount: Deactivated successfully.
Jan 23 09:24:23 compute-0 podman[218890]: 2026-01-23 09:24:23.511025591 +0000 UTC m=+0.061456202 container cleanup 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:24:23 compute-0 systemd[1]: libpod-conmon-10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb.scope: Deactivated successfully.
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.588 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance destroyed successfully.
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.588 182096 DEBUG nova.objects.instance [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:23 compute-0 podman[218914]: 2026-01-23 09:24:23.600327825 +0000 UTC m=+0.064087926 container remove 10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.603 182096 DEBUG nova.virt.libvirt.vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:24:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.603 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.604 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.604 182096 DEBUG os_vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.606 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.605 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae866d9-3ffb-4747-9f61-fd9860ab7ef2]: (4, ('Fri Jan 23 09:24:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb)\n10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb\nFri Jan 23 09:24:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb)\n10100c8c7adc0600f17a4f8567fea4cfa943cf151aae9a5ae50f87da17ec18cb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.606 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1eb1ee2-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.607 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4c8ce4-8104-4876-b11c-d5956f265eef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.608 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.608 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.615 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.619 182096 INFO os_vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.625 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[640d74be-80a7-4636-8366-6d21129dc599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.626 182096 DEBUG nova.virt.libvirt.driver [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Start _get_guest_xml network_info=[{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.627 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.634 182096 WARNING nova.virt.libvirt.driver [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.636 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[36116843-5a30-4138-8e85-03a28d07423d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.638 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f78eaeac-fbbc-4670-a91a-2a91f22b0ee3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.641 182096 DEBUG nova.virt.libvirt.host [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.642 182096 DEBUG nova.virt.libvirt.host [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.646 182096 DEBUG nova.virt.libvirt.host [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.646 182096 DEBUG nova.virt.libvirt.host [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.647 182096 DEBUG nova.virt.libvirt.driver [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.648 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.648 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.648 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.649 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.649 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.649 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.649 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.650 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.650 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.650 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.650 182096 DEBUG nova.virt.hardware [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.651 182096 DEBUG nova.objects.instance [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.652 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2909ea44-da22-4608-9a1e-e985422a8ba7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374611, 'reachable_time': 29546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218943, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.654 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.654 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[732e2ae9-8f9b-44b2-8c0d-d5f2674a395a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.664 182096 DEBUG nova.virt.libvirt.vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:24:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.664 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.665 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.665 182096 DEBUG nova.objects.instance [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.677 182096 DEBUG nova.virt.libvirt.driver [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <uuid>74eaa05c-e365-4879-af9a-1bf1c102eda7</uuid>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <name>instance-0000004b</name>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-1490124151</nova:name>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:24:23</nova:creationTime>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         <nova:port uuid="b1eb1ee2-fc97-423a-bac5-6219bd097839">
Jan 23 09:24:23 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <system>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="serial">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="uuid">74eaa05c-e365-4879-af9a-1bf1c102eda7</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </system>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <os>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </os>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <features>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </features>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:27:52:52"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <target dev="tapb1eb1ee2-fc"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/console.log" append="off"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <video>
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </video>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:24:23 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:24:23 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:24:23 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:24:23 compute-0 nova_compute[182092]: </domain>
Jan 23 09:24:23 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.678 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.731 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.732 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.781 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.782 182096 DEBUG nova.objects.instance [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.792 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.839 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.839 182096 DEBUG nova.virt.disk.api [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.840 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.886 182096 DEBUG oslo_concurrency.processutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.887 182096 DEBUG nova.virt.disk.api [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.887 182096 DEBUG nova.objects.instance [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.896 182096 DEBUG nova.virt.libvirt.vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.896 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.897 182096 DEBUG nova.network.os_vif_util [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.897 182096 DEBUG os_vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.898 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.898 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.899 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.902 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1eb1ee2-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.902 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1eb1ee2-fc, col_values=(('external_ids', {'iface-id': 'b1eb1ee2-fc97-423a-bac5-6219bd097839', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:52:52', 'vm-uuid': '74eaa05c-e365-4879-af9a-1bf1c102eda7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.904 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 NetworkManager[54920]: <info>  [1769160263.9049] manager: (tapb1eb1ee2-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.907 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.909 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.910 182096 INFO os_vif [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:24:23 compute-0 systemd-udevd[218874]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:23 compute-0 kernel: tapb1eb1ee2-fc: entered promiscuous mode
Jan 23 09:24:23 compute-0 NetworkManager[54920]: <info>  [1769160263.9644] manager: (tapb1eb1ee2-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00259|binding|INFO|Claiming lport b1eb1ee2-fc97-423a-bac5-6219bd097839 for this chassis.
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00260|binding|INFO|b1eb1ee2-fc97-423a-bac5-6219bd097839: Claiming fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.966 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 NetworkManager[54920]: <info>  [1769160263.9744] device (tapb1eb1ee2-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:24:23 compute-0 NetworkManager[54920]: <info>  [1769160263.9752] device (tapb1eb1ee2-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00261|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 ovn-installed in OVS
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.978 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 nova_compute[182092]: 2026-01-23 09:24:23.981 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:23 compute-0 ovn_controller[94697]: 2026-01-23T09:24:23Z|00262|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 up in Southbound
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.984 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.985 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.987 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.996 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c43bba9-5902-47ae-83fd-93c3f3638bdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.997 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.998 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.998 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c62d60a-64dc-4b66-8a5b-277911639fe6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:23.999 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9ae029-0af2-4e5f-8865-c77155e5a79c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 systemd-machined[153562]: New machine qemu-39-instance-0000004b.
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.007 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6d29be0f-0185-49a9-9ee5-c554b68a41ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-0000004b.
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.017 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1c5beb-e01c-43c3-a28c-5fad5548b3e3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.041 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[59f5bd89-4fe1-45ed-8ce7-64fc416f3aa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 NetworkManager[54920]: <info>  [1769160264.0462] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/144)
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.047 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc8ea83-7060-4d5c-97f0-4936614baa06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.073 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3c8c9f-af77-4f4d-ac84-30e0a2978d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.076 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a6a6c6-32c6-44f2-85d9-dd29000f72b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 NetworkManager[54920]: <info>  [1769160264.0923] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.096 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4d692a9c-b4bd-4acb-808c-351e74b8d5d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.109 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4e716291-955d-4756-842d-5b42071ccf1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218994, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.120 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf422a87-c9e5-4f99-8d39-09f8aa2d2aad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376957, 'tstamp': 376957}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218995, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.133 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ab44de4d-9316-49aa-8764-32d0d23328cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218996, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.156 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e93b5a8-6e26-4a22-812c-bf346478bfa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.198 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9ac0b7-875d-4da1-8884-9d3509b66773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.199 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.200 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.200 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.201 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:24 compute-0 NetworkManager[54920]: <info>  [1769160264.2022] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 23 09:24:24 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.205 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.207 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:24 compute-0 ovn_controller[94697]: 2026-01-23T09:24:24Z|00263|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.210 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.220 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.220 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c9df8d10-8902-4797-9efc-1ae1ec447e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.221 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:24:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:24.222 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.252 182096 DEBUG nova.compute.manager [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.252 182096 DEBUG oslo_concurrency.lockutils [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.253 182096 DEBUG oslo_concurrency.lockutils [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.253 182096 DEBUG oslo_concurrency.lockutils [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.253 182096 DEBUG nova.compute.manager [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.254 182096 WARNING nova.compute.manager [req-6839620f-a9ea-4e46-96e4-e9ed47db77dc req-e37f893f-dc8f-4831-8b26-73c77eef3eba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state reboot_started_hard.
Jan 23 09:24:24 compute-0 podman[219025]: 2026-01-23 09:24:24.502149861 +0000 UTC m=+0.030964327 container create 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:24:24 compute-0 systemd[1]: Started libpod-conmon-160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39.scope.
Jan 23 09:24:24 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8f1ea40c6e34988106fc13f4d8f34d56503c76d0c5b60759714311d189796a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:24:24 compute-0 podman[219025]: 2026-01-23 09:24:24.569041816 +0000 UTC m=+0.097856282 container init 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:24:24 compute-0 podman[219025]: 2026-01-23 09:24:24.573207896 +0000 UTC m=+0.102022362 container start 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:24:24 compute-0 podman[219025]: 2026-01-23 09:24:24.487592948 +0000 UTC m=+0.016407424 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:24:24 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [NOTICE]   (219047) : New worker (219049) forked
Jan 23 09:24:24 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [NOTICE]   (219047) : Loading success.
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.636 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 74eaa05c-e365-4879-af9a-1bf1c102eda7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.637 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160264.6360168, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.637 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Resumed (Lifecycle Event)
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.639 182096 DEBUG nova.compute.manager [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.642 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance rebooted successfully.
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.642 182096 DEBUG nova.compute.manager [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.667 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.669 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.692 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.692 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160264.6361933, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.692 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Started (Lifecycle Event)
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.712 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.714 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:24 compute-0 nova_compute[182092]: 2026-01-23 09:24:24.725 182096 DEBUG oslo_concurrency.lockutils [None req-ae2a8cb1-cd74-4fbb-b152-d2cf6b9ed8a7 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.276 182096 DEBUG nova.compute.manager [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 09:24:26 compute-0 ovn_controller[94697]: 2026-01-23T09:24:26Z|00264|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.369 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.369 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.370 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.371 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.371 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.372 182096 WARNING nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.372 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.372 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.373 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.373 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.373 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.373 182096 WARNING nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.374 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.374 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.374 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.374 182096 DEBUG oslo_concurrency.lockutils [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.375 182096 DEBUG nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.375 182096 WARNING nova.compute.manager [req-ffdde9e5-1079-4329-803c-29b8b5144e82 req-c5171bff-258b-4bdb-9fa3-74d188d458c4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state None.
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.375 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.381 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.381 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.415 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.423 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.424 182096 INFO nova.compute.claims [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.424 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'resources' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.430 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.446 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.485 182096 INFO nova.compute.resource_tracker [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating resource usage from migration a25dffda-8318-4b1d-9645-bd5ae27637b3
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.485 182096 DEBUG nova.compute.resource_tracker [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Starting to track incoming migration a25dffda-8318-4b1d-9645-bd5ae27637b3 with flavor 98e818ca-8ca1-4177-8a64-bde266c399d2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.537 182096 DEBUG nova.compute.provider_tree [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.549 182096 DEBUG nova.scheduler.client.report [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.571 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:26 compute-0 nova_compute[182092]: 2026-01-23 09:24:26.571 182096 INFO nova.compute.manager [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Migrating
Jan 23 09:24:27 compute-0 nova_compute[182092]: 2026-01-23 09:24:27.041 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:28 compute-0 podman[219056]: 2026-01-23 09:24:28.207129485 +0000 UTC m=+0.041905258 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:24:28 compute-0 podman[219057]: 2026-01-23 09:24:28.212238223 +0000 UTC m=+0.046368108 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:24:28 compute-0 nova_compute[182092]: 2026-01-23 09:24:28.905 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:29 compute-0 sshd-session[219095]: Accepted publickey for nova from 192.168.122.101 port 43452 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:24:29 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:24:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:24:29 compute-0 systemd-logind[746]: New session 51 of user nova.
Jan 23 09:24:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:24:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:24:29 compute-0 systemd[219099]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:29 compute-0 systemd[219099]: Queued start job for default target Main User Target.
Jan 23 09:24:29 compute-0 systemd[219099]: Created slice User Application Slice.
Jan 23 09:24:29 compute-0 systemd[219099]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:24:29 compute-0 systemd[219099]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:24:29 compute-0 systemd[219099]: Reached target Paths.
Jan 23 09:24:29 compute-0 systemd[219099]: Reached target Timers.
Jan 23 09:24:29 compute-0 systemd[219099]: Starting D-Bus User Message Bus Socket...
Jan 23 09:24:29 compute-0 systemd[219099]: Starting Create User's Volatile Files and Directories...
Jan 23 09:24:29 compute-0 systemd[219099]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:24:29 compute-0 systemd[219099]: Finished Create User's Volatile Files and Directories.
Jan 23 09:24:29 compute-0 systemd[219099]: Reached target Sockets.
Jan 23 09:24:29 compute-0 systemd[219099]: Reached target Basic System.
Jan 23 09:24:29 compute-0 systemd[219099]: Reached target Main User Target.
Jan 23 09:24:29 compute-0 systemd[219099]: Startup finished in 100ms.
Jan 23 09:24:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:24:29 compute-0 systemd[1]: Started Session 51 of User nova.
Jan 23 09:24:29 compute-0 sshd-session[219095]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:29 compute-0 sshd-session[219114]: Received disconnect from 192.168.122.101 port 43452:11: disconnected by user
Jan 23 09:24:29 compute-0 sshd-session[219114]: Disconnected from user nova 192.168.122.101 port 43452
Jan 23 09:24:29 compute-0 sshd-session[219095]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:24:29 compute-0 systemd-logind[746]: Session 51 logged out. Waiting for processes to exit.
Jan 23 09:24:29 compute-0 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 09:24:29 compute-0 systemd-logind[746]: Removed session 51.
Jan 23 09:24:29 compute-0 sshd-session[219116]: Accepted publickey for nova from 192.168.122.101 port 43454 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:24:29 compute-0 systemd-logind[746]: New session 53 of user nova.
Jan 23 09:24:29 compute-0 systemd[1]: Started Session 53 of User nova.
Jan 23 09:24:29 compute-0 sshd-session[219116]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:30 compute-0 sshd-session[219119]: Received disconnect from 192.168.122.101 port 43454:11: disconnected by user
Jan 23 09:24:30 compute-0 sshd-session[219119]: Disconnected from user nova 192.168.122.101 port 43454
Jan 23 09:24:30 compute-0 sshd-session[219116]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:24:30 compute-0 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 09:24:30 compute-0 systemd-logind[746]: Session 53 logged out. Waiting for processes to exit.
Jan 23 09:24:30 compute-0 systemd-logind[746]: Removed session 53.
Jan 23 09:24:30 compute-0 nova_compute[182092]: 2026-01-23 09:24:30.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:30.349 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:30.350 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.041 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.523 182096 DEBUG nova.compute.manager [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.524 182096 DEBUG oslo_concurrency.lockutils [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.524 182096 DEBUG oslo_concurrency.lockutils [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.524 182096 DEBUG oslo_concurrency.lockutils [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.524 182096 DEBUG nova.compute.manager [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:32 compute-0 nova_compute[182092]: 2026-01-23 09:24:32.524 182096 WARNING nova.compute.manager [req-09489186-0029-402d-9944-534d60843ee1 req-329546a7-caee-46f0-8141-e309ef2fdf4a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state active and task_state resize_migrating.
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:32.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'name': 'tempest-ServerActionsTestJSON-server-1490124151', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '860ef09b9e6e4866bbe99b6e769733a3', 'user_id': '89c019480e524c04af4d250b1c4051e5', 'hostId': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.003 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 74eaa05c-e365-4879-af9a-1bf1c102eda7 / tapb1eb1ee2-fc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.003 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdb76757-437b-4aba-9d0c-a13b499dfc05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.000811', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53bc6fda-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '1d6ab30fbec6c3c11f42b62fdba254d191de41f08655b64da0cccbf93ab53222'}]}, 'timestamp': '2026-01-23 09:24:33.003875', '_unique_id': 'bb29389ab87043eb9d28edb5d75c97a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.004 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.007 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fa60b84-b76c-467b-a649-850f29d81309', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.007200', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53bd02ba-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '620dc5b5cb53e362423ac6aabbb6381d154431a5c2e9a41bb7d9c50ec66bb1be'}]}, 'timestamp': '2026-01-23 09:24:33.007593', '_unique_id': 'ebcfe91cc5ec49a9ad657c4e1b44cbee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.010 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.010 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>]
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.028 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a925aa5d-bcea-4f50-97f2-fff9f597e25f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.011021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c04ac4-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '72b5ccc61c40eabca6c5fb64e4ab57f2957681156004087472ad79d3fe44e31b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.011021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c056d6-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '8593346b0110899b475af65148f2568f07786e813c9d0aba7629458c4fecfc59'}]}, 'timestamp': '2026-01-23 09:24:33.029395', '_unique_id': '807101fda84b4872b7afe3e623fef0ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.038 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.039 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd56109a6-b540-4b1d-9973-494ea06a51c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.032289', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c1d7a4-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': '429a343471b6c8d4148ed04ae7371d68fb99450b8086d338cf6952f29873aeca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.032289', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c1e316-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': 'e0ae36eec47b541db849dc0399caa56c88a122f0735e67f4ac78e9d8afe11714'}]}, 'timestamp': '2026-01-23 09:24:33.039538', '_unique_id': '7217d7c2b170496390ae243757b19d95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.040 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.042 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.042 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>]
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdf0fef2-6de2-4066-92ad-e18982661abc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.043021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c27984-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '92f81e28ffdefdbec73ce2437ff70283591944d40a303a7a6a6fa17c30c69db5'}]}, 'timestamp': '2026-01-23 09:24:33.043399', '_unique_id': 'd59c143ccdbe4ca4bbcb1f537072823d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.046 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.056 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/cpu volume: 8040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5f91477-5dfa-41ae-a7a3-8fcff1f7ddc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8040000000, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'timestamp': '2026-01-23T09:24:33.046338', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '53c49610-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.586230565, 'message_signature': 'e6a6d5b3a8796ea2b45e71e91f9c3dc9b3651906606d4042fafbd09cf8d87695'}]}, 'timestamp': '2026-01-23 09:24:33.057236', '_unique_id': 'd74bce1c1aa34c8b9c8b3e077470b040'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd98b0999-fd47-4a8d-9e8a-22446a428b36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.060072', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c51414-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '22215e63c330058267ede6effa3757e8db9eaa5707fbab639361912bb22c0a0e'}]}, 'timestamp': '2026-01-23 09:24:33.060464', '_unique_id': 'c9dd652b67f14587b20bade2ccab57b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.063 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89bcb791-f1ab-4405-a765-27ddd787771b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.063270', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c590e2-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '6779eda7a13d630268e2c740c51b15b72bdfbcabb7bc5a5738a7a831bff81bc4'}]}, 'timestamp': '2026-01-23 09:24:33.063674', '_unique_id': 'f814419f323a45868b17bec96a0b1a64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.064 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.066 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44d05e2e-df09-406d-b01e-0056fe34c305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.066476', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c60f04-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': 'e19d7d96976107ea312598f11fce768b0172f4b7c4f0808add436b510a4e9ef7'}]}, 'timestamp': '2026-01-23 09:24:33.066889', '_unique_id': '3da417698ae94d7ab7f3a59cf1528dd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.069 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.069 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.069 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 74eaa05c-e365-4879-af9a-1bf1c102eda7: ceilometer.compute.pollsters.NoVolumeException
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.070 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.070 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.070 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>]
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.070 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77b52296-35d4-4859-84c6-d5190334e691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.070792', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c6be04-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '1e338fcc06002c48495c324a7bc7b6e47726f7de03e0b53f987aff89f1547237'}]}, 'timestamp': '2026-01-23 09:24:33.071371', '_unique_id': '67f91dd052b74a21be8eaf08ad265f51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.074 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47bddfe3-3a6e-45be-8c60-f8af2e21a1ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.074629', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c74e96-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '81792c39901f6c8be85601c660a22db306c4efac31ac00b18406949a912c2759'}]}, 'timestamp': '2026-01-23 09:24:33.075070', '_unique_id': '8bc9ad5b756448bea3f27448a72a1d1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.078 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.usage volume: 30277632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.078 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1976ea50-cbd0-4dc6-9712-352e5e8a8390', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30277632, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.077918', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c7cd58-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': '1230c77bdfacecd6ee81f646b0306405ef26e1d64724fb1c999d87c2bbbe0f68'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.077918', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c7d8a2-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': '6b45f8f49f436c59790f9d9edb1801345a08ab901dcb29cf980e524cfe31038c'}]}, 'timestamp': '2026-01-23 09:24:33.078591', '_unique_id': '5a8572823c144f38b201e3f07a879e17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.081 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.081 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1490124151>]
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.081 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.081 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d081d17-0c0f-4756-9298-68aa16b0fc7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.081467', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c85340-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '529dd9dd238688505393e5b73a799252fda7a8d0b8431c6b420e99b9553a948e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.081467', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c85ec6-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '0cb5412b020228ac8b06791545001d802bf458362818c884ad1bfd60f61a377c'}]}, 'timestamp': '2026-01-23 09:24:33.082030', '_unique_id': '31ba68a7850249f799ede7e284a56d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.082 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.083 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98b5972f-d7e9-4dca-9f16-07bd9b8cb191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.083486', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c8a098-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '998c01a1b26c0d44e2a038ae35d58a4f7c847b2757ee19177c450e0b68fa6d23'}]}, 'timestamp': '2026-01-23 09:24:33.083751', '_unique_id': '00f610231f5a40068081dc5732eed7f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.084 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.latency volume: 146403796 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.latency volume: 728273 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '490a1cea-a93a-4cc6-a4f1-a66386ba0358', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 146403796, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.084878', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c8d6a8-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '3e94687595d395691310bb41e5b195846e4f60255c8cb7083d5e7ca6b35d22a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 728273, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 
'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.084878', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c8de96-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': 'fd179474522d1557f7ede0d4254406d0d9b201b1ff4fd81659efafe0000f8314'}]}, 'timestamp': '2026-01-23 09:24:33.085291', '_unique_id': '809f377904aa48ea93818bf746e8efd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.086 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.086 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54693a27-468b-46af-a6d5-92e4dcc73e3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.086378', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c91122-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': 'ea66963550a1079bd6fa9581ccc98a6744579e9666debb95ad18f1661d7040ce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.086378', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c91a14-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '44c8c24ee13a2bb9e33d948266fa78beea4be9fd94199dc5f57a512378ede9f9'}]}, 'timestamp': '2026-01-23 09:24:33.086812', '_unique_id': '52243e75fc14498baeb973b95aa95ad3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.087 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eb2f373-327a-498c-9104-b54fbc0c6baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': 'instance-0000004b-74eaa05c-e365-4879-af9a-1bf1c102eda7-tapb1eb1ee2-fc', 'timestamp': '2026-01-23T09:24:33.087875', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'tapb1eb1ee2-fc', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:52:52', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb1eb1ee2-fc'}, 'message_id': '53c94bd8-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.530272177, 'message_signature': '0c6eeac263c6f635aacae7bd42eea8ef630c4971dcd380676f3d850e05976198'}]}, 'timestamp': '2026-01-23 09:24:33.088098', '_unique_id': '15829948908b424bbf12673e5a8448d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.089 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.089 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f76f92c-1da4-4576-a94d-37d2fe22fd3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.089125', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c97c8e-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': '61a1fefb2b9716eb88e4e2cfa70ffadf4ace8a197d917a72565e6913c20ae33f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.089125', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c9844a-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.561744982, 'message_signature': '63ed4151c6ccea2d43345c9ac018798bb1342f57d5b861031861c3558d222d20'}]}, 'timestamp': '2026-01-23 09:24:33.089531', '_unique_id': '39c0c651f26c418b9adc6e7cc0af0473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.090 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0607b8b-16cc-47d2-8b20-3c27d1853dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.090749', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c9bbfe-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '1646a50b443e0fcbd0a39e3ecba424ff28cef5f15f31353f0c7f2f4443d517ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': 
None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.090749', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53c9c3c4-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': 'e51e62c6e6d7510e1a84550c09a01a78cf3e238dac49a5beb07238d7e69a87b8'}]}, 'timestamp': '2026-01-23 09:24:33.091156', '_unique_id': 'ac75c0e115574a2da4eca64a2a455cb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.091 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.092 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.092 12 DEBUG ceilometer.compute.pollsters [-] 74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ba76013-680d-4440-9296-2cc1d9bbe3aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-vda', 'timestamp': '2026-01-23T09:24:33.092345', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53c9fa42-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '467fb88f02913f672dedc765f77f8a53b00325a2f4c4c83c37b69e675fe42bf8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_name': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_name': 
None, 'resource_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7-sda', 'timestamp': '2026-01-23T09:24:33.092345', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1490124151', 'name': 'instance-0000004b', 'instance_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'instance_type': 'm1.nano', 'host': '742af336c184b58d980cb7924de2587f7c488a441b4c6b8237f99e70', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53ca029e-f83d-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 3778.540470206, 'message_signature': '54838da54da14fa1aa75d3249bfe3f0a647c6081a7b0cc6d4b8e03f1bb836d81'}]}, 'timestamp': '2026-01-23 09:24:33.092774', '_unique_id': '2c29db3609a44fe9a33a1b99353b2304'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:24:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:24:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:24:33 compute-0 sshd-session[219121]: Accepted publickey for nova from 192.168.122.101 port 43470 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:24:33 compute-0 systemd-logind[746]: New session 54 of user nova.
Jan 23 09:24:33 compute-0 systemd[1]: Started Session 54 of User nova.
Jan 23 09:24:33 compute-0 sshd-session[219121]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:33 compute-0 sshd-session[219124]: Received disconnect from 192.168.122.101 port 43470:11: disconnected by user
Jan 23 09:24:33 compute-0 sshd-session[219124]: Disconnected from user nova 192.168.122.101 port 43470
Jan 23 09:24:33 compute-0 sshd-session[219121]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:24:33 compute-0 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Session 54 logged out. Waiting for processes to exit.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Removed session 54.
Jan 23 09:24:33 compute-0 sshd-session[219126]: Accepted publickey for nova from 192.168.122.101 port 43484 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:24:33 compute-0 systemd-logind[746]: New session 55 of user nova.
Jan 23 09:24:33 compute-0 systemd[1]: Started Session 55 of User nova.
Jan 23 09:24:33 compute-0 sshd-session[219126]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:33 compute-0 sshd-session[219129]: Received disconnect from 192.168.122.101 port 43484:11: disconnected by user
Jan 23 09:24:33 compute-0 sshd-session[219129]: Disconnected from user nova 192.168.122.101 port 43484
Jan 23 09:24:33 compute-0 sshd-session[219126]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:24:33 compute-0 systemd[1]: session-55.scope: Deactivated successfully.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Session 55 logged out. Waiting for processes to exit.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Removed session 55.
Jan 23 09:24:33 compute-0 sshd-session[219131]: Accepted publickey for nova from 192.168.122.101 port 43486 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:24:33 compute-0 systemd-logind[746]: New session 56 of user nova.
Jan 23 09:24:33 compute-0 systemd[1]: Started Session 56 of User nova.
Jan 23 09:24:33 compute-0 sshd-session[219131]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:24:33 compute-0 nova_compute[182092]: 2026-01-23 09:24:33.908 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:33 compute-0 sshd-session[219134]: Received disconnect from 192.168.122.101 port 43486:11: disconnected by user
Jan 23 09:24:33 compute-0 sshd-session[219134]: Disconnected from user nova 192.168.122.101 port 43486
Jan 23 09:24:33 compute-0 sshd-session[219131]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:24:33 compute-0 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Session 56 logged out. Waiting for processes to exit.
Jan 23 09:24:33 compute-0 systemd-logind[746]: Removed session 56.
Jan 23 09:24:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:34.352 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.671 182096 DEBUG nova.compute.manager [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.672 182096 DEBUG oslo_concurrency.lockutils [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.672 182096 DEBUG oslo_concurrency.lockutils [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.672 182096 DEBUG oslo_concurrency.lockutils [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.673 182096 DEBUG nova.compute.manager [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:34 compute-0 nova_compute[182092]: 2026-01-23 09:24:34.673 182096 WARNING nova.compute.manager [req-69ed2572-4b19-4aa4-b7e8-7811968c51db req-47d91183-cb3b-4918-bac8-4d61722ff171 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state active and task_state resize_migrating.
Jan 23 09:24:35 compute-0 ovn_controller[94697]: 2026-01-23T09:24:35Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:52:52 10.100.0.10
Jan 23 09:24:37 compute-0 nova_compute[182092]: 2026-01-23 09:24:37.043 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:38 compute-0 nova_compute[182092]: 2026-01-23 09:24:38.909 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:39 compute-0 nova_compute[182092]: 2026-01-23 09:24:39.038 182096 INFO nova.network.neutron [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 09:24:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:39.859 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:39.860 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:39.861 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:40 compute-0 podman[219144]: 2026-01-23 09:24:40.215437499 +0000 UTC m=+0.046318825 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:24:40 compute-0 podman[219143]: 2026-01-23 09:24:40.237220605 +0000 UTC m=+0.068264989 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:24:40 compute-0 nova_compute[182092]: 2026-01-23 09:24:40.520 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Acquiring lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:40 compute-0 nova_compute[182092]: 2026-01-23 09:24:40.520 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Acquired lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:40 compute-0 nova_compute[182092]: 2026-01-23 09:24:40.521 182096 DEBUG nova.network.neutron [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:24:41 compute-0 nova_compute[182092]: 2026-01-23 09:24:41.202 182096 DEBUG nova.compute.manager [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-changed-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:41 compute-0 nova_compute[182092]: 2026-01-23 09:24:41.202 182096 DEBUG nova.compute.manager [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Refreshing instance network info cache due to event network-changed-0ebc09ac-8a21-4794-a3b1-a164d16aea2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:24:41 compute-0 nova_compute[182092]: 2026-01-23 09:24:41.202 182096 DEBUG oslo_concurrency.lockutils [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:42 compute-0 nova_compute[182092]: 2026-01-23 09:24:42.044 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:42 compute-0 podman[219180]: 2026-01-23 09:24:42.195919908 +0000 UTC m=+0.034910712 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, 
url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:24:43 compute-0 nova_compute[182092]: 2026-01-23 09:24:43.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:44 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:24:44 compute-0 systemd[219099]: Activating special unit Exit the Session...
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped target Main User Target.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped target Basic System.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped target Paths.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped target Sockets.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped target Timers.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:24:44 compute-0 systemd[219099]: Closed D-Bus User Message Bus Socket.
Jan 23 09:24:44 compute-0 systemd[219099]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:24:44 compute-0 systemd[219099]: Removed slice User Application Slice.
Jan 23 09:24:44 compute-0 systemd[219099]: Reached target Shutdown.
Jan 23 09:24:44 compute-0 systemd[219099]: Finished Exit the Session.
Jan 23 09:24:44 compute-0 systemd[219099]: Reached target Exit the Session.
Jan 23 09:24:44 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:24:44 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:24:44 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:24:44 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:24:44 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:24:44 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:24:44 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.051 182096 DEBUG nova.network.neutron [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating instance_info_cache with network_info: [{"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.759 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Releasing lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.762 182096 DEBUG oslo_concurrency.lockutils [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.762 182096 DEBUG nova.network.neutron [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Refreshing network info cache for port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.871 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.872 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.873 182096 INFO nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Creating image(s)
Jan 23 09:24:46 compute-0 nova_compute[182092]: 2026-01-23 09:24:46.873 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.045 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.252 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.297 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.297 182096 DEBUG nova.virt.disk.api [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Checking if we can resize image /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.298 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.339 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.340 182096 DEBUG nova.virt.disk.api [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Cannot resize image /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.355 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.355 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Ensure instance console log exists: /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.355 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.356 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.356 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.358 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Start _get_guest_xml network_info=[{"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--430675", "vif_mac": "fa:16:3e:e7:17:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.361 182096 WARNING nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.369 182096 DEBUG nova.virt.libvirt.host [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.370 182096 DEBUG nova.virt.libvirt.host [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.372 182096 DEBUG nova.virt.libvirt.host [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.372 182096 DEBUG nova.virt.libvirt.host [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.373 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.373 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.374 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.374 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.374 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.374 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.375 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.375 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.375 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.375 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.375 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.376 182096 DEBUG nova.virt.hardware [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.376 182096 DEBUG nova.objects.instance [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.392 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.406 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.440 182096 DEBUG oslo_concurrency.processutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.config --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.441 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Acquiring lock "/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.442 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.442 182096 DEBUG oslo_concurrency.lockutils [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Lock "/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.443 182096 DEBUG nova.virt.libvirt.vif [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1666544074',display_name='tempest-TestNetworkAdvancedServerOps-server-1666544074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1666544074',id=79,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUy3GwNT1ABl30+dbXTc8mev0VCv0LaxmfqitpN52Zzp2Hp4SusYqDYas9WPokOYp7bRtKXWgMTk6zdKjQqEkYfv82d+d6JBq9I1XNFt1DRjcM0XHRWTlMQr8+rQmk02A==',key_name='tempest-TestNetworkAdvancedServerOps-822185329',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-axzfp38y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:35Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=06627a8c-e9af-44f2-8f53-e757c83abd9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--430675", "vif_mac": "fa:16:3e:e7:17:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.444 182096 DEBUG nova.network.os_vif_util [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Converting VIF {"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--430675", "vif_mac": "fa:16:3e:e7:17:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.445 182096 DEBUG nova.network.os_vif_util [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.447 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <uuid>06627a8c-e9af-44f2-8f53-e757c83abd9c</uuid>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <name>instance-0000004f</name>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1666544074</nova:name>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:24:47</nova:creationTime>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:user uuid="2880f53bded147989ea61dc68ec0880e">tempest-TestNetworkAdvancedServerOps-169193993-project-member</nova:user>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:project uuid="5a5525bfc549464cace77d44548fb012">tempest-TestNetworkAdvancedServerOps-169193993</nova:project>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         <nova:port uuid="0ebc09ac-8a21-4794-a3b1-a164d16aea2f">
Jan 23 09:24:47 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <system>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="serial">06627a8c-e9af-44f2-8f53-e757c83abd9c</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="uuid">06627a8c-e9af-44f2-8f53-e757c83abd9c</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </system>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <os>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </os>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <features>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </features>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk.config"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:e7:17:10"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <target dev="tap0ebc09ac-8a"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/console.log" append="off"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <video>
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </video>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:24:47 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:24:47 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:24:47 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:24:47 compute-0 nova_compute[182092]: </domain>
Jan 23 09:24:47 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.448 182096 DEBUG nova.virt.libvirt.vif [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1666544074',display_name='tempest-TestNetworkAdvancedServerOps-server-1666544074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1666544074',id=79,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUy3GwNT1ABl30+dbXTc8mev0VCv0LaxmfqitpN52Zzp2Hp4SusYqDYas9WPokOYp7bRtKXWgMTk6zdKjQqEkYfv82d+d6JBq9I1XNFt1DRjcM0XHRWTlMQr8+rQmk02A==',key_name='tempest-TestNetworkAdvancedServerOps-822185329',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-axzfp38y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:35Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=06627a8c-e9af-44f2-8f53-e757c83abd9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--430675", "vif_mac": "fa:16:3e:e7:17:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.448 182096 DEBUG nova.network.os_vif_util [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Converting VIF {"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--430675", "vif_mac": "fa:16:3e:e7:17:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.449 182096 DEBUG nova.network.os_vif_util [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.449 182096 DEBUG os_vif [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.452 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.452 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.453 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.454 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.455 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ebc09ac-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.455 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ebc09ac-8a, col_values=(('external_ids', {'iface-id': '0ebc09ac-8a21-4794-a3b1-a164d16aea2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:17:10', 'vm-uuid': '06627a8c-e9af-44f2-8f53-e757c83abd9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.4570] manager: (tap0ebc09ac-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.459 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.461 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.462 182096 INFO os_vif [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a')
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.502 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.503 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.503 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] No VIF found with MAC fa:16:3e:e7:17:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.503 182096 INFO nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Using config drive
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.5414] manager: (tap0ebc09ac-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Jan 23 09:24:47 compute-0 kernel: tap0ebc09ac-8a: entered promiscuous mode
Jan 23 09:24:47 compute-0 ovn_controller[94697]: 2026-01-23T09:24:47Z|00265|binding|INFO|Claiming lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f for this chassis.
Jan 23 09:24:47 compute-0 ovn_controller[94697]: 2026-01-23T09:24:47Z|00266|binding|INFO|0ebc09ac-8a21-4794-a3b1-a164d16aea2f: Claiming fa:16:3e:e7:17:10 10.100.0.9
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.549 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:17:10 10.100.0.9'], port_security=['fa:16:3e:e7:17:10 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '06627a8c-e9af-44f2-8f53-e757c83abd9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9e3e89d3-be6c-4364-8201-2a41a92c813c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1581789d-a988-41eb-abd2-ab643a835c6e, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0ebc09ac-8a21-4794-a3b1-a164d16aea2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.550 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f in datapath e4de5fdb-5fcd-4811-bc87-18beb2397eb7 bound to our chassis
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.552 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e4de5fdb-5fcd-4811-bc87-18beb2397eb7
Jan 23 09:24:47 compute-0 ovn_controller[94697]: 2026-01-23T09:24:47Z|00267|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f ovn-installed in OVS
Jan 23 09:24:47 compute-0 ovn_controller[94697]: 2026-01-23T09:24:47Z|00268|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f up in Southbound
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.563 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[288e36de-e1bc-41b5-8af7-3f0d144611c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.565 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape4de5fdb-51 in ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.564 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.567 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape4de5fdb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.567 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdef0dc-7258-4fd1-b63b-fb8438a08beb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.568 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdc6210-0cee-43fa-9426-0cd67c583495]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.576 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7472363d-128a-4be6-9bde-f073927cab19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 systemd-udevd[219228]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:47 compute-0 systemd-machined[153562]: New machine qemu-40-instance-0000004f.
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.586 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0aa567-da2a-4bd8-bc45-610a579efa21]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-0000004f.
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.5923] device (tap0ebc09ac-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.5938] device (tap0ebc09ac-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.613 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b8cb24dd-9773-4d6a-9d44-5011f4e35ad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 systemd-udevd[219232]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7e1759-dc8e-4d46-81f8-e3df4990ebc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.6202] manager: (tape4de5fdb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/148)
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.642 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[eed12eb2-d27f-44f3-8e91-4f7f60f0a25b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.645 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1c10ece9-afeb-4db1-8ced-63e800763b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.6632] device (tape4de5fdb-50): carrier: link connected
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.666 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[051c2c50-f416-4cc0-a76a-760815a0731e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.679 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0892f4e4-a490-4ac3-9f15-40e5dc4f0c70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape4de5fdb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:af:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379314, 'reachable_time': 38735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219251, 'error': None, 'target': 'ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.692 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7433966-72c9-40fd-ae58-0ee22a874a86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:af66'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379314, 'tstamp': 379314}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219252, 'error': None, 'target': 'ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.704 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[52a9ca60-a354-4cec-bba7-04a56171d087]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape4de5fdb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:af:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379314, 'reachable_time': 38735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219253, 'error': None, 'target': 'ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.727 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[53e688bd-fd75-4c51-b9ed-d952b0cc7e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.769 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8de29aad-bdd1-4fa7-9b50-8d87fb539cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.770 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4de5fdb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.770 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.770 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4de5fdb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.772 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 NetworkManager[54920]: <info>  [1769160287.7726] manager: (tape4de5fdb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 23 09:24:47 compute-0 kernel: tape4de5fdb-50: entered promiscuous mode
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.775 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape4de5fdb-50, col_values=(('external_ids', {'iface-id': 'c17b89c7-e246-4f27-b70c-145aec317b7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.775 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 ovn_controller[94697]: 2026-01-23T09:24:47Z|00269|binding|INFO|Releasing lport c17b89c7-e246-4f27-b70c-145aec317b7f from this chassis (sb_readonly=0)
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.777 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e4de5fdb-5fcd-4811-bc87-18beb2397eb7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e4de5fdb-5fcd-4811-bc87-18beb2397eb7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.777 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.788 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[93381378-8c01-4aff-8bbf-9bfa9e92bad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.788 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-e4de5fdb-5fcd-4811-bc87-18beb2397eb7
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/e4de5fdb-5fcd-4811-bc87-18beb2397eb7.pid.haproxy
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID e4de5fdb-5fcd-4811-bc87-18beb2397eb7
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:24:47 compute-0 nova_compute[182092]: 2026-01-23 09:24:47.790 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:47.790 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'env', 'PROCESS_TAG=haproxy-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e4de5fdb-5fcd-4811-bc87-18beb2397eb7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.023 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160288.0235715, 06627a8c-e9af-44f2-8f53-e757c83abd9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.024 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] VM Resumed (Lifecycle Event)
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.026 182096 DEBUG nova.compute.manager [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.029 182096 INFO nova.virt.libvirt.driver [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Instance running successfully.
Jan 23 09:24:48 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.031 182096 DEBUG nova.virt.libvirt.guest [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.031 182096 DEBUG nova.virt.libvirt.driver [None req-ea099c86-40a8-43b1-aa3f-ea7295d98ba1 7bb73acf70454a6d9e8c92a0a47a0581 0253a51f40244c77bb1cc7f6316c1fe5 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.046 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.048 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.068 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.068 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160288.026289, 06627a8c-e9af-44f2-8f53-e757c83abd9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.068 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] VM Started (Lifecycle Event)
Jan 23 09:24:48 compute-0 podman[219288]: 2026-01-23 09:24:48.087195983 +0000 UTC m=+0.042698595 container create 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.103 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.106 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:48 compute-0 systemd[1]: Started libpod-conmon-33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef.scope.
Jan 23 09:24:48 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:24:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/809fd9fb917e8bcd7ce64acb2fa45b3199426dd5550a788e6093c4c5334a5160/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:24:48 compute-0 podman[219288]: 2026-01-23 09:24:48.138986971 +0000 UTC m=+0.094489602 container init 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.141 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:24:48 compute-0 podman[219288]: 2026-01-23 09:24:48.143043384 +0000 UTC m=+0.098545995 container start 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:24:48 compute-0 podman[219288]: 2026-01-23 09:24:48.07153044 +0000 UTC m=+0.027033071 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:24:48 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [NOTICE]   (219304) : New worker (219306) forked
Jan 23 09:24:48 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [NOTICE]   (219304) : Loading success.
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.675 182096 DEBUG nova.network.neutron [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updated VIF entry in instance network info cache for port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.675 182096 DEBUG nova.network.neutron [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating instance_info_cache with network_info: [{"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:48 compute-0 nova_compute[182092]: 2026-01-23 09:24:48.687 182096 DEBUG oslo_concurrency.lockutils [req-05bbf37b-a468-4809-9660-70bfabf382aa req-3763eacd-762b-46df-9126-ba270c584dcc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.436 182096 DEBUG nova.compute.manager [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.436 182096 DEBUG oslo_concurrency.lockutils [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.437 182096 DEBUG oslo_concurrency.lockutils [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.437 182096 DEBUG oslo_concurrency.lockutils [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.437 182096 DEBUG nova.compute.manager [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.437 182096 WARNING nova.compute.manager [req-f9079a70-61c1-43b8-be16-e12cd1c2f697 req-831545e4-11e0-4c1f-be85-73bc6182743f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state resized and task_state None.
Jan 23 09:24:50 compute-0 nova_compute[182092]: 2026-01-23 09:24:50.799 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:51 compute-0 podman[219312]: 2026-01-23 09:24:51.224235413 +0000 UTC m=+0.065064791 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.527 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.528 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.543 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.626 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.626 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.633 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.634 182096 INFO nova.compute.claims [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.776 182096 DEBUG nova.compute.provider_tree [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.791 182096 DEBUG nova.scheduler.client.report [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.814 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.814 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.868 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.868 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.884 182096 INFO nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:24:51 compute-0 nova_compute[182092]: 2026-01-23 09:24:51.895 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.046 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.096 182096 DEBUG nova.policy [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.115 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.116 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.116 182096 INFO nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating image(s)
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.117 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.117 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.118 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.128 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.186 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.187 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.187 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.197 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.255 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.256 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.278 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.279 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.280 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.338 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.339 182096 DEBUG nova.virt.disk.api [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.339 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.395 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.396 182096 DEBUG nova.virt.disk.api [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.397 182096 DEBUG nova.objects.instance [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.407 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.407 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Ensure instance console log exists: /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.407 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.408 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.408 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.456 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.657 182096 DEBUG nova.compute.manager [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.658 182096 DEBUG oslo_concurrency.lockutils [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.658 182096 DEBUG oslo_concurrency.lockutils [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.658 182096 DEBUG oslo_concurrency.lockutils [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.659 182096 DEBUG nova.compute.manager [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.659 182096 WARNING nova.compute.manager [req-9be4b291-372d-44ba-959e-f1b05ee11af2 req-5ea2cf69-51dc-4f4f-96de-c46cebd8f746 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state active and task_state None.
Jan 23 09:24:52 compute-0 nova_compute[182092]: 2026-01-23 09:24:52.771 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Successfully created port: 1f3b05bb-8f61-4855-8260-da7acc110775 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.632 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Successfully updated port: 1f3b05bb-8f61-4855-8260-da7acc110775 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.644 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.644 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.644 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.803 182096 DEBUG nova.compute.manager [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-changed-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.803 182096 DEBUG nova.compute.manager [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Refreshing instance network info cache due to event network-changed-1f3b05bb-8f61-4855-8260-da7acc110775. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:24:53 compute-0 nova_compute[182092]: 2026-01-23 09:24:53.804 182096 DEBUG oslo_concurrency.lockutils [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:24:54 compute-0 nova_compute[182092]: 2026-01-23 09:24:54.032 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.058 182096 DEBUG nova.network.neutron [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Updating instance_info_cache with network_info: [{"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.080 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.081 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance network_info: |[{"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.081 182096 DEBUG oslo_concurrency.lockutils [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.081 182096 DEBUG nova.network.neutron [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Refreshing network info cache for port 1f3b05bb-8f61-4855-8260-da7acc110775 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.083 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start _get_guest_xml network_info=[{"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.087 182096 WARNING nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.095 182096 DEBUG nova.virt.libvirt.host [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.095 182096 DEBUG nova.virt.libvirt.host [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.100 182096 DEBUG nova.virt.libvirt.host [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.101 182096 DEBUG nova.virt.libvirt.host [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.102 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.102 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.103 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.103 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.103 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.103 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.104 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.104 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.104 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.105 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.105 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.105 182096 DEBUG nova.virt.hardware [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.108 182096 DEBUG nova.virt.libvirt.vif [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-tempest.common.compute-instance-2027005408',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:51Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.109 182096 DEBUG nova.network.os_vif_util [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.109 182096 DEBUG nova.network.os_vif_util [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.110 182096 DEBUG nova.objects.instance [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.121 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <uuid>014ff798-a231-4a80-b2b9-35eba3c3e263</uuid>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <name>instance-00000052</name>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:name>tempest-tempest.common.compute-instance-2027005408</nova:name>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:24:55</nova:creationTime>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         <nova:port uuid="1f3b05bb-8f61-4855-8260-da7acc110775">
Jan 23 09:24:55 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <system>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="serial">014ff798-a231-4a80-b2b9-35eba3c3e263</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="uuid">014ff798-a231-4a80-b2b9-35eba3c3e263</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </system>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <os>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </os>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <features>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </features>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:ff:77:b3"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <target dev="tap1f3b05bb-8f"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/console.log" append="off"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <video>
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </video>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:24:55 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:24:55 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:24:55 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:24:55 compute-0 nova_compute[182092]: </domain>
Jan 23 09:24:55 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.125 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Preparing to wait for external event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.126 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.126 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.126 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.127 182096 DEBUG nova.virt.libvirt.vif [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-tempest.common.compute-instance-2027005408',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:51Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.127 182096 DEBUG nova.network.os_vif_util [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.128 182096 DEBUG nova.network.os_vif_util [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.128 182096 DEBUG os_vif [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.129 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.129 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.130 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.132 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.132 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f3b05bb-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.133 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f3b05bb-8f, col_values=(('external_ids', {'iface-id': '1f3b05bb-8f61-4855-8260-da7acc110775', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:77:b3', 'vm-uuid': '014ff798-a231-4a80-b2b9-35eba3c3e263'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 NetworkManager[54920]: <info>  [1769160295.1351] manager: (tap1f3b05bb-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.138 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.141 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.142 182096 INFO os_vif [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f')
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.181 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.181 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.181 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No VIF found with MAC fa:16:3e:ff:77:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.182 182096 INFO nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Using config drive
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.577 182096 INFO nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating config drive at /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.581 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjbh9382m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.699 182096 DEBUG oslo_concurrency.processutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjbh9382m" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:24:55 compute-0 kernel: tap1f3b05bb-8f: entered promiscuous mode
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.742 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 NetworkManager[54920]: <info>  [1769160295.7443] manager: (tap1f3b05bb-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 23 09:24:55 compute-0 ovn_controller[94697]: 2026-01-23T09:24:55Z|00270|binding|INFO|Claiming lport 1f3b05bb-8f61-4855-8260-da7acc110775 for this chassis.
Jan 23 09:24:55 compute-0 ovn_controller[94697]: 2026-01-23T09:24:55Z|00271|binding|INFO|1f3b05bb-8f61-4855-8260-da7acc110775: Claiming fa:16:3e:ff:77:b3 10.100.0.5
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.758 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:77:b3 10.100.0.5'], port_security=['fa:16:3e:ff:77:b3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '014ff798-a231-4a80-b2b9-35eba3c3e263', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d4e4b42-43fc-4dc5-ab6b-9700587279aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1f3b05bb-8f61-4855-8260-da7acc110775) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.759 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1f3b05bb-8f61-4855-8260-da7acc110775 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.761 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:24:55 compute-0 ovn_controller[94697]: 2026-01-23T09:24:55Z|00272|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 ovn-installed in OVS
Jan 23 09:24:55 compute-0 ovn_controller[94697]: 2026-01-23T09:24:55Z|00273|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 up in Southbound
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.768 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.770 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.775 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8f2bf7-6014-4c52-b9cd-f314b506a760]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 systemd-udevd[219372]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:24:55 compute-0 systemd-machined[153562]: New machine qemu-41-instance-00000052.
Jan 23 09:24:55 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-00000052.
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.801 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b34b7ec4-9454-4995-8062-853851370e99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.803 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a45543-99e7-40f2-90fe-0a99efa50a86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 NetworkManager[54920]: <info>  [1769160295.8080] device (tap1f3b05bb-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:24:55 compute-0 NetworkManager[54920]: <info>  [1769160295.8086] device (tap1f3b05bb-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.827 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[764a50e3-6785-4e00-9c01-87beb587211e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.842 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c38e07-9884-46ce-bb6d-f0c4f2bc3d9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219382, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.852 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[de81bd9c-cf5d-4797-81a0-a987e27168b4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376965, 'tstamp': 376965}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219384, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376968, 'tstamp': 376968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219384, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.853 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 nova_compute[182092]: 2026-01-23 09:24:55.855 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.856 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.856 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.857 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:24:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:24:55.857 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.093 182096 DEBUG nova.compute.manager [req-a778ce8c-e64f-4bd8-9d53-c1cd02df7660 req-b83fc469-a064-4596-89bd-d5b24d1bd102 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.093 182096 DEBUG oslo_concurrency.lockutils [req-a778ce8c-e64f-4bd8-9d53-c1cd02df7660 req-b83fc469-a064-4596-89bd-d5b24d1bd102 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.094 182096 DEBUG oslo_concurrency.lockutils [req-a778ce8c-e64f-4bd8-9d53-c1cd02df7660 req-b83fc469-a064-4596-89bd-d5b24d1bd102 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.094 182096 DEBUG oslo_concurrency.lockutils [req-a778ce8c-e64f-4bd8-9d53-c1cd02df7660 req-b83fc469-a064-4596-89bd-d5b24d1bd102 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.094 182096 DEBUG nova.compute.manager [req-a778ce8c-e64f-4bd8-9d53-c1cd02df7660 req-b83fc469-a064-4596-89bd-d5b24d1bd102 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Processing event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.282 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160296.282446, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.283 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Started (Lifecycle Event)
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.285 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.287 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.290 182096 INFO nova.virt.libvirt.driver [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance spawned successfully.
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.290 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.305 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.309 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.309 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.310 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.310 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.311 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.312 182096 DEBUG nova.virt.libvirt.driver [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.318 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.348 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.349 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160296.2831175, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.349 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Paused (Lifecycle Event)
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.362 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.364 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160296.2865682, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.364 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Resumed (Lifecycle Event)
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.379 182096 INFO nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Took 4.26 seconds to spawn the instance on the hypervisor.
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.379 182096 DEBUG nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.380 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.384 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.419 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.453 182096 INFO nova.compute.manager [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Took 4.86 seconds to build instance.
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.464 182096 DEBUG oslo_concurrency.lockutils [None req-050f7bd9-77b5-4823-8d77-8d7e582dc985 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.533 182096 DEBUG nova.network.neutron [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Updated VIF entry in instance network info cache for port 1f3b05bb-8f61-4855-8260-da7acc110775. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.533 182096 DEBUG nova.network.neutron [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Updating instance_info_cache with network_info: [{"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:24:56 compute-0 nova_compute[182092]: 2026-01-23 09:24:56.547 182096 DEBUG oslo_concurrency.lockutils [req-f2617008-0600-4293-a721-5ca18dcd6e82 req-da136eea-5668-4ed5-8fec-8d6f7ed4baa0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-014ff798-a231-4a80-b2b9-35eba3c3e263" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:24:57 compute-0 nova_compute[182092]: 2026-01-23 09:24:57.048 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.272 182096 DEBUG nova.compute.manager [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.272 182096 DEBUG oslo_concurrency.lockutils [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.272 182096 DEBUG oslo_concurrency.lockutils [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.272 182096 DEBUG oslo_concurrency.lockutils [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.272 182096 DEBUG nova.compute.manager [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.273 182096 WARNING nova.compute.manager [req-8a3dce81-0959-463e-a2fa-61ff6cfd25b6 req-4359dd1f-78aa-443b-a87e-e92491a41820 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received unexpected event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with vm_state active and task_state None.
Jan 23 09:24:58 compute-0 nova_compute[182092]: 2026-01-23 09:24:58.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:24:59 compute-0 podman[219399]: 2026-01-23 09:24:59.228763385 +0000 UTC m=+0.055749696 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:24:59 compute-0 podman[219400]: 2026-01-23 09:24:59.240179322 +0000 UTC m=+0.068820595 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.386 182096 INFO nova.compute.manager [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Rebuilding instance
Jan 23 09:24:59 compute-0 ovn_controller[94697]: 2026-01-23T09:24:59Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:17:10 10.100.0.9
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.750 182096 DEBUG nova.compute.manager [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.863 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.884 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.893 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.902 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.918 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:24:59 compute-0 nova_compute[182092]: 2026-01-23 09:24:59.920 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.135 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.668 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.668 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.668 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.669 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.719 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.777 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.777 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.833 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.838 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.893 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.894 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.950 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:00 compute-0 nova_compute[182092]: 2026-01-23 09:25:00.955 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.010 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.011 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.066 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.297 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.298 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5329MB free_disk=73.20527648925781GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.299 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.299 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.374 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 74eaa05c-e365-4879-af9a-1bf1c102eda7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.374 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 06627a8c-e9af-44f2-8f53-e757c83abd9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.374 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 014ff798-a231-4a80-b2b9-35eba3c3e263 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.375 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.375 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=4 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.465 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.486 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.505 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:25:01 compute-0 nova_compute[182092]: 2026-01-23 09:25:01.506 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.050 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.506 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.506 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.507 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.667 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.668 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:02 compute-0 nova_compute[182092]: 2026-01-23 09:25:02.668 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:05 compute-0 nova_compute[182092]: 2026-01-23 09:25:05.137 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:05 compute-0 nova_compute[182092]: 2026-01-23 09:25:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:25:06 compute-0 nova_compute[182092]: 2026-01-23 09:25:06.786 182096 INFO nova.compute.manager [None req-0c35f2e3-dca5-4cbd-b3ad-2b7a9745b711 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Get console output
Jan 23 09:25:06 compute-0 nova_compute[182092]: 2026-01-23 09:25:06.789 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:25:07 compute-0 nova_compute[182092]: 2026-01-23 09:25:07.050 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:07 compute-0 ovn_controller[94697]: 2026-01-23T09:25:07Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:77:b3 10.100.0.5
Jan 23 09:25:07 compute-0 ovn_controller[94697]: 2026-01-23T09:25:07Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:77:b3 10.100.0.5
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.639 182096 DEBUG nova.compute.manager [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-changed-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.639 182096 DEBUG nova.compute.manager [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Refreshing instance network info cache due to event network-changed-0ebc09ac-8a21-4794-a3b1-a164d16aea2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.640 182096 DEBUG oslo_concurrency.lockutils [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.640 182096 DEBUG oslo_concurrency.lockutils [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.640 182096 DEBUG nova.network.neutron [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Refreshing network info cache for port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.799 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:09.799 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:09.800 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.942 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.942 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.943 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.943 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.943 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.953 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.953 182096 INFO nova.compute.manager [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Terminating instance
Jan 23 09:25:09 compute-0 nova_compute[182092]: 2026-01-23 09:25:09.959 182096 DEBUG nova.compute.manager [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:25:09 compute-0 kernel: tap0ebc09ac-8a (unregistering): left promiscuous mode
Jan 23 09:25:09 compute-0 NetworkManager[54920]: <info>  [1769160309.9877] device (tap0ebc09ac-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.001 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00274|binding|INFO|Releasing lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f from this chassis (sb_readonly=0)
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00275|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f down in Southbound
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00276|binding|INFO|Removing iface tap0ebc09ac-8a ovn-installed in OVS
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.003 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.008 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:17:10 10.100.0.9'], port_security=['fa:16:3e:e7:17:10 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '06627a8c-e9af-44f2-8f53-e757c83abd9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9e3e89d3-be6c-4364-8201-2a41a92c813c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1581789d-a988-41eb-abd2-ab643a835c6e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0ebc09ac-8a21-4794-a3b1-a164d16aea2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.010 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f in datapath e4de5fdb-5fcd-4811-bc87-18beb2397eb7 unbound from our chassis
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.011 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e4de5fdb-5fcd-4811-bc87-18beb2397eb7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.014 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b49c967a-9d41-4bf3-9082-7b599bb4978a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.015 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7 namespace which is not needed anymore
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.019 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 23 09:25:10 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Consumed 11.024s CPU time.
Jan 23 09:25:10 compute-0 systemd-machined[153562]: Machine qemu-40-instance-0000004f terminated.
Jan 23 09:25:10 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [NOTICE]   (219304) : haproxy version is 2.8.14-c23fe91
Jan 23 09:25:10 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [NOTICE]   (219304) : path to executable is /usr/sbin/haproxy
Jan 23 09:25:10 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [ALERT]    (219304) : Current worker (219306) exited with code 143 (Terminated)
Jan 23 09:25:10 compute-0 neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7[219300]: [WARNING]  (219304) : All workers exited. Exiting... (0)
Jan 23 09:25:10 compute-0 systemd[1]: libpod-33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef.scope: Deactivated successfully.
Jan 23 09:25:10 compute-0 podman[219493]: 2026-01-23 09:25:10.115213328 +0000 UTC m=+0.037238114 container died 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:25:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef-userdata-shm.mount: Deactivated successfully.
Jan 23 09:25:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-809fd9fb917e8bcd7ce64acb2fa45b3199426dd5550a788e6093c4c5334a5160-merged.mount: Deactivated successfully.
Jan 23 09:25:10 compute-0 podman[219493]: 2026-01-23 09:25:10.136822896 +0000 UTC m=+0.058847683 container cleanup 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.138 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 systemd[1]: libpod-conmon-33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef.scope: Deactivated successfully.
Jan 23 09:25:10 compute-0 kernel: tap0ebc09ac-8a: entered promiscuous mode
Jan 23 09:25:10 compute-0 NetworkManager[54920]: <info>  [1769160310.1732] manager: (tap0ebc09ac-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.174 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00277|binding|INFO|Claiming lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f for this chassis.
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00278|binding|INFO|0ebc09ac-8a21-4794-a3b1-a164d16aea2f: Claiming fa:16:3e:e7:17:10 10.100.0.9
Jan 23 09:25:10 compute-0 kernel: tap0ebc09ac-8a (unregistering): left promiscuous mode
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.189 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:17:10 10.100.0.9'], port_security=['fa:16:3e:e7:17:10 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '06627a8c-e9af-44f2-8f53-e757c83abd9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9e3e89d3-be6c-4364-8201-2a41a92c813c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1581789d-a988-41eb-abd2-ab643a835c6e, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0ebc09ac-8a21-4794-a3b1-a164d16aea2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00279|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f ovn-installed in OVS
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.194 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00280|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f up in Southbound
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00281|binding|INFO|Releasing lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f from this chassis (sb_readonly=1)
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00282|if_status|INFO|Not setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f down as sb is readonly
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00283|binding|INFO|Releasing lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f from this chassis (sb_readonly=0)
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00284|binding|INFO|Setting lport 0ebc09ac-8a21-4794-a3b1-a164d16aea2f down in Southbound
Jan 23 09:25:10 compute-0 ovn_controller[94697]: 2026-01-23T09:25:10Z|00285|binding|INFO|Removing iface tap0ebc09ac-8a ovn-installed in OVS
Jan 23 09:25:10 compute-0 podman[219518]: 2026-01-23 09:25:10.200042824 +0000 UTC m=+0.048569130 container remove 33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.203 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:17:10 10.100.0.9'], port_security=['fa:16:3e:e7:17:10 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '06627a8c-e9af-44f2-8f53-e757c83abd9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9e3e89d3-be6c-4364-8201-2a41a92c813c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1581789d-a988-41eb-abd2-ab643a835c6e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0ebc09ac-8a21-4794-a3b1-a164d16aea2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.201 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.208 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9c43f9-84ae-4a0b-821a-bdb313c19723]: (4, ('Fri Jan 23 09:25:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7 (33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef)\n33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef\nFri Jan 23 09:25:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7 (33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef)\n33c7c6bdaa2c609a1d6f6b35daa01e50d26e92b29ea15a745f7d8e899280faef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.210 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[67ed1d25-1a97-4177-bf31-ff305c522e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.211 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4de5fdb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.211 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.213 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 kernel: tape4de5fdb-50: left promiscuous mode
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.225 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.227 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.229 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ccd0be-01b3-4942-af23-96ed8df868fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.236 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c06198d-9b60-4644-89eb-230a899b7766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.237 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3551a80c-85f7-4683-9c86-7e905c314996]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.240 182096 INFO nova.virt.libvirt.driver [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Instance destroyed successfully.
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.240 182096 DEBUG nova.objects.instance [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'resources' on Instance uuid 06627a8c-e9af-44f2-8f53-e757c83abd9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.250 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[55ce6816-81d1-4097-99bd-e296c0ebd26a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379309, 'reachable_time': 25006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219543, 'error': None, 'target': 'ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 systemd[1]: run-netns-ovnmeta\x2de4de5fdb\x2d5fcd\x2d4811\x2dbc87\x2d18beb2397eb7.mount: Deactivated successfully.
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.253 182096 DEBUG nova.virt.libvirt.vif [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1666544074',display_name='tempest-TestNetworkAdvancedServerOps-server-1666544074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1666544074',id=79,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUy3GwNT1ABl30+dbXTc8mev0VCv0LaxmfqitpN52Zzp2Hp4SusYqDYas9WPokOYp7bRtKXWgMTk6zdKjQqEkYfv82d+d6JBq9I1XNFt1DRjcM0XHRWTlMQr8+rQmk02A==',key_name='tempest-TestNetworkAdvancedServerOps-822185329',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-axzfp38y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:24:52Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=06627a8c-e9af-44f2-8f53-e757c83abd9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.254 182096 DEBUG nova.network.os_vif_util [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.254 182096 DEBUG nova.network.os_vif_util [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.255 182096 DEBUG os_vif [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.256 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.256 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ebc09ac-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.257 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.254 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e4de5fdb-5fcd-4811-bc87-18beb2397eb7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.254 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[489b1335-1ac5-4a19-a11f-3d91dfd4c378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.259 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f in datapath e4de5fdb-5fcd-4811-bc87-18beb2397eb7 unbound from our chassis
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.261 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.263 182096 INFO os_vif [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:17:10,bridge_name='br-int',has_traffic_filtering=True,id=0ebc09ac-8a21-4794-a3b1-a164d16aea2f,network=Network(e4de5fdb-5fcd-4811-bc87-18beb2397eb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ebc09ac-8a')
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.263 182096 INFO nova.virt.libvirt.driver [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Deleting instance files /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c_del
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.266 182096 INFO nova.virt.libvirt.driver [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Deletion of /var/lib/nova/instances/06627a8c-e9af-44f2-8f53-e757c83abd9c_del complete
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.269 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e4de5fdb-5fcd-4811-bc87-18beb2397eb7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.269 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[67f28411-3e03-4c3f-9b6d-0878611af272]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.270 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f in datapath e4de5fdb-5fcd-4811-bc87-18beb2397eb7 unbound from our chassis
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.272 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e4de5fdb-5fcd-4811-bc87-18beb2397eb7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:25:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:10.275 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9b9edd-b79d-43c3-8628-928a4267fc1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:10 compute-0 podman[219545]: 2026-01-23 09:25:10.294855736 +0000 UTC m=+0.037966399 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.297 182096 DEBUG nova.compute.manager [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.297 182096 DEBUG oslo_concurrency.lockutils [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.298 182096 DEBUG oslo_concurrency.lockutils [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.298 182096 DEBUG oslo_concurrency.lockutils [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.298 182096 DEBUG nova.compute.manager [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.298 182096 DEBUG nova.compute.manager [req-08a60c4d-b713-4e14-bbc1-e508de0b2f33 req-3fa2d114-15f9-424d-81dc-e6e90e0e82d7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-unplugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:25:10 compute-0 podman[219542]: 2026-01-23 09:25:10.301164859 +0000 UTC m=+0.050051227 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.340 182096 INFO nova.compute.manager [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.340 182096 DEBUG oslo.service.loopingcall [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.340 182096 DEBUG nova.compute.manager [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:25:10 compute-0 nova_compute[182092]: 2026-01-23 09:25:10.341 182096 DEBUG nova.network.neutron [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.052 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 kernel: tap1f3b05bb-8f (unregistering): left promiscuous mode
Jan 23 09:25:12 compute-0 NetworkManager[54920]: <info>  [1769160312.0760] device (tap1f3b05bb-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.082 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 ovn_controller[94697]: 2026-01-23T09:25:12Z|00286|binding|INFO|Releasing lport 1f3b05bb-8f61-4855-8260-da7acc110775 from this chassis (sb_readonly=0)
Jan 23 09:25:12 compute-0 ovn_controller[94697]: 2026-01-23T09:25:12Z|00287|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 down in Southbound
Jan 23 09:25:12 compute-0 ovn_controller[94697]: 2026-01-23T09:25:12Z|00288|binding|INFO|Removing iface tap1f3b05bb-8f ovn-installed in OVS
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.090 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.094 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:77:b3 10.100.0.5'], port_security=['fa:16:3e:ff:77:b3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '014ff798-a231-4a80-b2b9-35eba3c3e263', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5d4e4b42-43fc-4dc5-ab6b-9700587279aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1f3b05bb-8f61-4855-8260-da7acc110775) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.095 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1f3b05bb-8f61-4855-8260-da7acc110775 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.096 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.102 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.109 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c0644221-22a2-44fa-9fc1-849f1871b65d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.115 182096 DEBUG nova.network.neutron [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.129 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b62be0-88cc-4be1-bb1c-ac54b27c126d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000052.scope: Deactivated successfully.
Jan 23 09:25:12 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000052.scope: Consumed 11.609s CPU time.
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.131 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1186534d-bb40-4405-98c2-28280d8917a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.133 182096 INFO nova.compute.manager [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Took 1.79 seconds to deallocate network for instance.
Jan 23 09:25:12 compute-0 systemd-machined[153562]: Machine qemu-41-instance-00000052 terminated.
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.151 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5566c664-f2fe-46e6-bb17-0451b5fb0f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.164 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c6c65c-8468-4bc3-b6eb-d301cc03dc3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219591, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.176 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6f997e-8cd9-4a10-bbc5-11fad260fa98]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376965, 'tstamp': 376965}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219592, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376968, 'tstamp': 376968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219592, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.177 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.178 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.181 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.182 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.182 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.182 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:12.182 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.211 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.211 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.271 182096 DEBUG nova.network.neutron [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updated VIF entry in instance network info cache for port 0ebc09ac-8a21-4794-a3b1-a164d16aea2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.271 182096 DEBUG nova.network.neutron [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Updating instance_info_cache with network_info: [{"id": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "address": "fa:16:3e:e7:17:10", "network": {"id": "e4de5fdb-5fcd-4811-bc87-18beb2397eb7", "bridge": "br-int", "label": "tempest-network-smoke--430675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ebc09ac-8a", "ovs_interfaceid": "0ebc09ac-8a21-4794-a3b1-a164d16aea2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.278 182096 DEBUG nova.compute.manager [req-e9750a05-74dd-4e50-b00e-519d30a0c78e req-18ee7ad8-c9a5-493a-9735-be95e545240e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-deleted-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.290 182096 DEBUG oslo_concurrency.lockutils [req-8c758668-e759-44a0-a2b8-4b4fc9f9befc req-2ac7d90e-cb5d-4923-8881-658d034982ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-06627a8c-e9af-44f2-8f53-e757c83abd9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.295 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.316 182096 DEBUG nova.compute.provider_tree [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.330 182096 DEBUG nova.scheduler.client.report [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.348 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:12 compute-0 podman[219605]: 2026-01-23 09:25:12.371389011 +0000 UTC m=+0.047813265 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.393 182096 INFO nova.scheduler.client.report [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Deleted allocations for instance 06627a8c-e9af-44f2-8f53-e757c83abd9c
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.431 182096 DEBUG nova.compute.manager [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.432 182096 DEBUG oslo_concurrency.lockutils [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.432 182096 DEBUG oslo_concurrency.lockutils [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.432 182096 DEBUG oslo_concurrency.lockutils [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.432 182096 DEBUG nova.compute.manager [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.432 182096 WARNING nova.compute.manager [req-04d910b7-f9ae-4deb-9693-dbf66508ff15 req-7ad61bf1-a1eb-429d-a3d5-ab49b9a2f774 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received unexpected event network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with vm_state active and task_state rebuilding.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.434 182096 DEBUG nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.434 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.434 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 WARNING nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state deleted and task_state None.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.435 182096 DEBUG oslo_concurrency.lockutils [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.436 182096 DEBUG nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] No waiting events found dispatching network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.436 182096 WARNING nova.compute.manager [req-a89c7553-a6c9-4174-97dc-524b432bcb3e req-57ff270f-aeda-44b0-bf2b-47f45528b38e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Received unexpected event network-vif-plugged-0ebc09ac-8a21-4794-a3b1-a164d16aea2f for instance with vm_state deleted and task_state None.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.460 182096 DEBUG oslo_concurrency.lockutils [None req-df47d83e-1ddb-4ebf-a045-3f71cdb6d594 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "06627a8c-e9af-44f2-8f53-e757c83abd9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.963 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance shutdown successfully after 13 seconds.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.966 182096 INFO nova.virt.libvirt.driver [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance destroyed successfully.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.969 182096 INFO nova.virt.libvirt.driver [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance destroyed successfully.
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.970 182096 DEBUG nova.virt.libvirt.vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-ServerActionsTestJSON-server-896717732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-proje
ct-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:24:58Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.970 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.971 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.971 182096 DEBUG os_vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.972 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.972 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f3b05bb-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.974 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.976 182096 INFO os_vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f')
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.976 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Deleting instance files /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263_del
Jan 23 09:25:12 compute-0 nova_compute[182092]: 2026-01-23 09:25:12.977 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Deletion of /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263_del complete
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.189 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.190 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating image(s)
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.190 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.190 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.191 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.201 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.246 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.247 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.248 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.257 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.303 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.304 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.326 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.327 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.327 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.372 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.373 182096 DEBUG nova.virt.disk.api [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.373 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.419 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.420 182096 DEBUG nova.virt.disk.api [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.420 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.420 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Ensure instance console log exists: /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.421 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.421 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.421 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.423 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start _get_guest_xml network_info=[{"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.426 182096 WARNING nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.435 182096 DEBUG nova.virt.libvirt.host [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.435 182096 DEBUG nova.virt.libvirt.host [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.440 182096 DEBUG nova.virt.libvirt.host [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.440 182096 DEBUG nova.virt.libvirt.host [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.441 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.441 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.441 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.442 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.443 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.443 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.443 182096 DEBUG nova.virt.hardware [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.443 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.467 182096 DEBUG nova.virt.libvirt.vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-ServerActionsTestJSON-server-896717732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name=
'tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:13Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.467 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.468 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.470 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <uuid>014ff798-a231-4a80-b2b9-35eba3c3e263</uuid>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <name>instance-00000052</name>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-896717732</nova:name>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:25:13</nova:creationTime>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="11cc348c-4b05-42ba-a4b9-513b91dede76"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         <nova:port uuid="1f3b05bb-8f61-4855-8260-da7acc110775">
Jan 23 09:25:13 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <system>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="serial">014ff798-a231-4a80-b2b9-35eba3c3e263</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="uuid">014ff798-a231-4a80-b2b9-35eba3c3e263</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </system>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <os>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </os>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <features>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </features>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:ff:77:b3"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <target dev="tap1f3b05bb-8f"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/console.log" append="off"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <video>
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </video>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:25:13 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:25:13 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:25:13 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:25:13 compute-0 nova_compute[182092]: </domain>
Jan 23 09:25:13 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.470 182096 DEBUG nova.compute.manager [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Preparing to wait for external event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.470 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.470 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.470 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.471 182096 DEBUG nova.virt.libvirt.vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-ServerActionsTestJSON-server-896717732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:24:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name=
'tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:13Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.471 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.471 182096 DEBUG nova.network.os_vif_util [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.472 182096 DEBUG os_vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.473 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.473 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.475 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.476 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f3b05bb-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.476 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f3b05bb-8f, col_values=(('external_ids', {'iface-id': '1f3b05bb-8f61-4855-8260-da7acc110775', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:77:b3', 'vm-uuid': '014ff798-a231-4a80-b2b9-35eba3c3e263'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.477 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:13 compute-0 NetworkManager[54920]: <info>  [1769160313.4781] manager: (tap1f3b05bb-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.479 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.481 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.481 182096 INFO os_vif [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f')
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.531 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.531 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.532 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No VIF found with MAC fa:16:3e:ff:77:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.532 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Using config drive
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.562 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:13 compute-0 nova_compute[182092]: 2026-01-23 09:25:13.594 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'keypairs' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.275 182096 INFO nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Creating config drive at /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.279 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzkvg92wx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.398 182096 DEBUG oslo_concurrency.processutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzkvg92wx" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:14 compute-0 kernel: tap1f3b05bb-8f: entered promiscuous mode
Jan 23 09:25:14 compute-0 NetworkManager[54920]: <info>  [1769160314.4414] manager: (tap1f3b05bb-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 23 09:25:14 compute-0 ovn_controller[94697]: 2026-01-23T09:25:14Z|00289|binding|INFO|Claiming lport 1f3b05bb-8f61-4855-8260-da7acc110775 for this chassis.
Jan 23 09:25:14 compute-0 ovn_controller[94697]: 2026-01-23T09:25:14Z|00290|binding|INFO|1f3b05bb-8f61-4855-8260-da7acc110775: Claiming fa:16:3e:ff:77:b3 10.100.0.5
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.461 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:77:b3 10.100.0.5'], port_security=['fa:16:3e:ff:77:b3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '014ff798-a231-4a80-b2b9-35eba3c3e263', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5d4e4b42-43fc-4dc5-ab6b-9700587279aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1f3b05bb-8f61-4855-8260-da7acc110775) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.461 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 ovn_controller[94697]: 2026-01-23T09:25:14Z|00291|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 ovn-installed in OVS
Jan 23 09:25:14 compute-0 ovn_controller[94697]: 2026-01-23T09:25:14Z|00292|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 up in Southbound
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.464 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.464 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1f3b05bb-8f61-4855-8260-da7acc110775 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.466 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.474 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.477 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f7152f92-5cfa-40ca-88cc-886951c60f30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 systemd-udevd[219661]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:25:14 compute-0 systemd-machined[153562]: New machine qemu-42-instance-00000052.
Jan 23 09:25:14 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-00000052.
Jan 23 09:25:14 compute-0 NetworkManager[54920]: <info>  [1769160314.4924] device (tap1f3b05bb-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:25:14 compute-0 NetworkManager[54920]: <info>  [1769160314.4931] device (tap1f3b05bb-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.502 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[33a003f2-a3e7-4ff2-89bc-87c45edae1ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.504 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[90f75420-7102-4033-ba2f-576c057615ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.523 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd1f935-a261-4db3-8402-2c63b73f066c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.535 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dba6b707-f09d-46ac-a97c-6834e85e3e84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219673, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.546 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[88f1afc4-9d2f-4237-b970-993fdca52a2f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376965, 'tstamp': 376965}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219674, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376968, 'tstamp': 376968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219674, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.547 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.549 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.550 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.550 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.550 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:14.551 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.759 182096 DEBUG nova.compute.manager [req-aa4447e2-f43f-4633-8dea-62a71c895f51 req-3e12fa00-9edb-4c48-8157-637ac8b094ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.759 182096 DEBUG oslo_concurrency.lockutils [req-aa4447e2-f43f-4633-8dea-62a71c895f51 req-3e12fa00-9edb-4c48-8157-637ac8b094ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.759 182096 DEBUG oslo_concurrency.lockutils [req-aa4447e2-f43f-4633-8dea-62a71c895f51 req-3e12fa00-9edb-4c48-8157-637ac8b094ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.759 182096 DEBUG oslo_concurrency.lockutils [req-aa4447e2-f43f-4633-8dea-62a71c895f51 req-3e12fa00-9edb-4c48-8157-637ac8b094ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.760 182096 DEBUG nova.compute.manager [req-aa4447e2-f43f-4633-8dea-62a71c895f51 req-3e12fa00-9edb-4c48-8157-637ac8b094ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Processing event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.906 182096 DEBUG nova.compute.manager [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.907 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 014ff798-a231-4a80-b2b9-35eba3c3e263 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.907 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160314.9061136, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.907 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Started (Lifecycle Event)
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.909 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.912 182096 INFO nova.virt.libvirt.driver [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance spawned successfully.
Jan 23 09:25:14 compute-0 nova_compute[182092]: 2026-01-23 09:25:14.912 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.121 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.123 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.134 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.134 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.135 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.135 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.135 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.136 182096 DEBUG nova.virt.libvirt.driver [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.150 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.150 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160314.9067402, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.151 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Paused (Lifecycle Event)
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.166 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.167 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160314.9095414, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.168 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Resumed (Lifecycle Event)
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.178 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.180 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.197 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.201 182096 DEBUG nova.compute.manager [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.283 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.283 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.284 182096 DEBUG nova.objects.instance [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:25:15 compute-0 nova_compute[182092]: 2026-01-23 09:25:15.344 182096 DEBUG oslo_concurrency.lockutils [None req-dcc395d2-a153-4dd0-ba18-0cbc52a5737a 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:16 compute-0 ovn_controller[94697]: 2026-01-23T09:25:16Z|00293|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:25:16 compute-0 nova_compute[182092]: 2026-01-23 09:25:16.519 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.053 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.340 182096 DEBUG nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.340 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.340 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 DEBUG nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 WARNING nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received unexpected event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with vm_state active and task_state None.
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 DEBUG nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.341 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.342 182096 DEBUG oslo_concurrency.lockutils [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.342 182096 DEBUG nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:17 compute-0 nova_compute[182092]: 2026-01-23 09:25:17.342 182096 WARNING nova.compute.manager [req-1f3dcb4f-1658-4992-bc98-52444d0e7e81 req-0582d84f-9842-491e-9c0d-ee2b70adec59 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received unexpected event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with vm_state active and task_state None.
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.479 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.643 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.643 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.644 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.644 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.645 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.652 182096 INFO nova.compute.manager [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Terminating instance
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.658 182096 DEBUG nova.compute.manager [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:25:18 compute-0 kernel: tap1f3b05bb-8f (unregistering): left promiscuous mode
Jan 23 09:25:18 compute-0 NetworkManager[54920]: <info>  [1769160318.6761] device (tap1f3b05bb-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 ovn_controller[94697]: 2026-01-23T09:25:18Z|00294|binding|INFO|Releasing lport 1f3b05bb-8f61-4855-8260-da7acc110775 from this chassis (sb_readonly=0)
Jan 23 09:25:18 compute-0 ovn_controller[94697]: 2026-01-23T09:25:18Z|00295|binding|INFO|Setting lport 1f3b05bb-8f61-4855-8260-da7acc110775 down in Southbound
Jan 23 09:25:18 compute-0 ovn_controller[94697]: 2026-01-23T09:25:18Z|00296|binding|INFO|Removing iface tap1f3b05bb-8f ovn-installed in OVS
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.694 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:77:b3 10.100.0.5'], port_security=['fa:16:3e:ff:77:b3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '014ff798-a231-4a80-b2b9-35eba3c3e263', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5d4e4b42-43fc-4dc5-ab6b-9700587279aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1f3b05bb-8f61-4855-8260-da7acc110775) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.696 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1f3b05bb-8f61-4855-8260-da7acc110775 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.697 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.711 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a7605693-3d12-45f6-a77e-1e6a5543396f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000052.scope: Deactivated successfully.
Jan 23 09:25:18 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000052.scope: Consumed 4.174s CPU time.
Jan 23 09:25:18 compute-0 systemd-machined[153562]: Machine qemu-42-instance-00000052 terminated.
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.733 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc2322d-0fbc-4049-ba1b-d291cba173ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.735 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8cabab46-0dfc-4843-b687-99d3b4dc684b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.753 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[30e7325c-fcae-476f-907c-ab41464c2b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.767 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cb656f-ddc0-4479-a7c2-080186cdb4e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 28767, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219694, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.778 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5f233d-eb6d-40e2-a919-17e405a28804]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376965, 'tstamp': 376965}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219695, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfa610e7c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376968, 'tstamp': 376968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219695, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.780 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.781 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.784 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.784 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.784 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.785 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:18.785 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.872 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.875 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.898 182096 INFO nova.virt.libvirt.driver [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Instance destroyed successfully.
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.899 182096 DEBUG nova.objects.instance [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 014ff798-a231-4a80-b2b9-35eba3c3e263 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.920 182096 DEBUG nova.virt.libvirt.vif [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:24:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2027005408',display_name='tempest-ServerActionsTestJSON-server-896717732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2027005408',id=82,image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:25:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-d64z8hlg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='11cc348c-4b05-42ba-a4b9-513b91dede76',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:25:15Z,user_data=None,user_id='89c019480e524c04af4d250b1c4051e5',uuid=014ff798-a231-4a80-b2b9-35eba3c3e263,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.920 182096 DEBUG nova.network.os_vif_util [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "1f3b05bb-8f61-4855-8260-da7acc110775", "address": "fa:16:3e:ff:77:b3", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f3b05bb-8f", "ovs_interfaceid": "1f3b05bb-8f61-4855-8260-da7acc110775", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.921 182096 DEBUG nova.network.os_vif_util [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.921 182096 DEBUG os_vif [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.923 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.923 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f3b05bb-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.924 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.925 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.927 182096 INFO os_vif [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:77:b3,bridge_name='br-int',has_traffic_filtering=True,id=1f3b05bb-8f61-4855-8260-da7acc110775,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f3b05bb-8f')
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.927 182096 INFO nova.virt.libvirt.driver [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Deleting instance files /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263_del
Jan 23 09:25:18 compute-0 nova_compute[182092]: 2026-01-23 09:25:18.927 182096 INFO nova.virt.libvirt.driver [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Deletion of /var/lib/nova/instances/014ff798-a231-4a80-b2b9-35eba3c3e263_del complete
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.008 182096 INFO nova.compute.manager [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.008 182096 DEBUG oslo.service.loopingcall [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.008 182096 DEBUG nova.compute.manager [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.008 182096 DEBUG nova.network.neutron [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.429 182096 DEBUG nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.429 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-unplugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.430 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.431 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.431 182096 DEBUG oslo_concurrency.lockutils [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.431 182096 DEBUG nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] No waiting events found dispatching network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.431 182096 WARNING nova.compute.manager [req-073da1c7-7c38-44c0-bef0-b95c4920041e req-3c59f693-dd90-445e-84f7-1407700592f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received unexpected event network-vif-plugged-1f3b05bb-8f61-4855-8260-da7acc110775 for instance with vm_state active and task_state deleting.
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.459 182096 DEBUG nova.network.neutron [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.472 182096 INFO nova.compute.manager [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Took 0.46 seconds to deallocate network for instance.
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.521 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.521 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.594 182096 DEBUG nova.compute.provider_tree [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.607 182096 DEBUG nova.scheduler.client.report [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.621 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.639 182096 INFO nova.scheduler.client.report [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Deleted allocations for instance 014ff798-a231-4a80-b2b9-35eba3c3e263
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.690 182096 DEBUG oslo_concurrency.lockutils [None req-e57b1184-44ce-4b27-a9ce-529d8b11d01d 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "014ff798-a231-4a80-b2b9-35eba3c3e263" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:19 compute-0 ovn_controller[94697]: 2026-01-23T09:25:19Z|00297|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:25:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:19.802 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:19 compute-0 nova_compute[182092]: 2026-01-23 09:25:19.834 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:21 compute-0 nova_compute[182092]: 2026-01-23 09:25:21.527 182096 DEBUG nova.compute.manager [req-e89b6d7d-acc3-42f1-9e31-8bb9fa310ecb req-b979add3-018a-4d17-ab23-6afb3bbb8f37 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Received event network-vif-deleted-1f3b05bb-8f61-4855-8260-da7acc110775 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:22 compute-0 nova_compute[182092]: 2026-01-23 09:25:22.054 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:22 compute-0 podman[219713]: 2026-01-23 09:25:22.229692752 +0000 UTC m=+0.062123741 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:25:22 compute-0 nova_compute[182092]: 2026-01-23 09:25:22.338 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:23 compute-0 nova_compute[182092]: 2026-01-23 09:25:23.924 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:24 compute-0 nova_compute[182092]: 2026-01-23 09:25:24.110 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:24 compute-0 nova_compute[182092]: 2026-01-23 09:25:24.110 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:24 compute-0 nova_compute[182092]: 2026-01-23 09:25:24.110 182096 DEBUG nova.network.neutron [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:25:25 compute-0 nova_compute[182092]: 2026-01-23 09:25:25.237 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160310.2365968, 06627a8c-e9af-44f2-8f53-e757c83abd9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:25 compute-0 nova_compute[182092]: 2026-01-23 09:25:25.238 182096 INFO nova.compute.manager [-] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] VM Stopped (Lifecycle Event)
Jan 23 09:25:25 compute-0 nova_compute[182092]: 2026-01-23 09:25:25.257 182096 DEBUG nova.compute.manager [None req-0318fce2-aaf6-41f4-818c-c9b182a97e95 - - - - - -] [instance: 06627a8c-e9af-44f2-8f53-e757c83abd9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.135 182096 DEBUG nova.network.neutron [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.165 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.255 182096 DEBUG nova.virt.libvirt.driver [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.255 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Creating file /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/379382eb10b7436eb97de95c1db6caba.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.256 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/379382eb10b7436eb97de95c1db6caba.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.564 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/379382eb10b7436eb97de95c1db6caba.tmp" returned: 1 in 0.308s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.565 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/379382eb10b7436eb97de95c1db6caba.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.565 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Creating directory /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.566 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.728 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7" returned: 0 in 0.162s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.731 182096 DEBUG nova.virt.libvirt.driver [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:25:26 compute-0 nova_compute[182092]: 2026-01-23 09:25:26.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:27 compute-0 nova_compute[182092]: 2026-01-23 09:25:27.056 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:28 compute-0 kernel: tapb1eb1ee2-fc (unregistering): left promiscuous mode
Jan 23 09:25:28 compute-0 NetworkManager[54920]: <info>  [1769160328.8488] device (tapb1eb1ee2-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:25:28 compute-0 ovn_controller[94697]: 2026-01-23T09:25:28Z|00298|binding|INFO|Releasing lport b1eb1ee2-fc97-423a-bac5-6219bd097839 from this chassis (sb_readonly=0)
Jan 23 09:25:28 compute-0 ovn_controller[94697]: 2026-01-23T09:25:28Z|00299|binding|INFO|Setting lport b1eb1ee2-fc97-423a-bac5-6219bd097839 down in Southbound
Jan 23 09:25:28 compute-0 ovn_controller[94697]: 2026-01-23T09:25:28Z|00300|binding|INFO|Removing iface tapb1eb1ee2-fc ovn-installed in OVS
Jan 23 09:25:28 compute-0 nova_compute[182092]: 2026-01-23 09:25:28.857 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:28 compute-0 nova_compute[182092]: 2026-01-23 09:25:28.858 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:28.863 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:52:52 10.100.0.10'], port_security=['fa:16:3e:27:52:52 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '74eaa05c-e365-4879-af9a-1bf1c102eda7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b1eb1ee2-fc97-423a-bac5-6219bd097839) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:28.865 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b1eb1ee2-fc97-423a-bac5-6219bd097839 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:25:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:28.866 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:25:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:28.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d38357a6-d963-417b-a2d4-640419c141f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:28.867 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:25:28 compute-0 nova_compute[182092]: 2026-01-23 09:25:28.879 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:28 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 23 09:25:28 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004b.scope: Consumed 12.438s CPU time.
Jan 23 09:25:28 compute-0 systemd-machined[153562]: Machine qemu-39-instance-0000004b terminated.
Jan 23 09:25:28 compute-0 nova_compute[182092]: 2026-01-23 09:25:28.925 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:28 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [NOTICE]   (219047) : haproxy version is 2.8.14-c23fe91
Jan 23 09:25:28 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [NOTICE]   (219047) : path to executable is /usr/sbin/haproxy
Jan 23 09:25:28 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [WARNING]  (219047) : Exiting Master process...
Jan 23 09:25:28 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [ALERT]    (219047) : Current worker (219049) exited with code 143 (Terminated)
Jan 23 09:25:28 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[219037]: [WARNING]  (219047) : All workers exited. Exiting... (0)
Jan 23 09:25:28 compute-0 systemd[1]: libpod-160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39.scope: Deactivated successfully.
Jan 23 09:25:28 compute-0 podman[219760]: 2026-01-23 09:25:28.967154053 +0000 UTC m=+0.034846051 container died 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:25:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39-userdata-shm.mount: Deactivated successfully.
Jan 23 09:25:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-ce8f1ea40c6e34988106fc13f4d8f34d56503c76d0c5b60759714311d189796a-merged.mount: Deactivated successfully.
Jan 23 09:25:28 compute-0 podman[219760]: 2026-01-23 09:25:28.987803079 +0000 UTC m=+0.055495077 container cleanup 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:25:28 compute-0 systemd[1]: libpod-conmon-160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39.scope: Deactivated successfully.
Jan 23 09:25:29 compute-0 podman[219783]: 2026-01-23 09:25:29.028136982 +0000 UTC m=+0.025826304 container remove 160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.031 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d4816e6e-8342-4d71-a3be-2e07c9223a4f]: (4, ('Fri Jan 23 09:25:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39)\n160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39\nFri Jan 23 09:25:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39)\n160f4fdf79196c7d2eca1972141ef1bb87a7f6f244d8ebc5eb51a3a737bbca39\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.033 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d534a3-8a2e-4916-b46f-764587054a36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.033 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.035 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.049 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.052 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ce197197-049f-4cb7-bbc5-127a08beeb72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.064 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[299ad564-2c66-4d31-a31e-1e725215bec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.064 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[703ff9e1-7e19-4649-ab63-df90c33ff4c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.076 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0233ec45-4dc5-4888-9259-f35a2ee476be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376952, 'reachable_time': 38608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219799, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.078 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:25:29 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:29.078 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[22e3b9b5-a8d6-4c08-ae73-4282a347b065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.188 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.427 182096 DEBUG nova.compute.manager [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.427 182096 DEBUG oslo_concurrency.lockutils [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.427 182096 DEBUG oslo_concurrency.lockutils [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.427 182096 DEBUG oslo_concurrency.lockutils [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.428 182096 DEBUG nova.compute.manager [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.428 182096 WARNING nova.compute.manager [req-d82bffa0-d9dd-4ddf-ab39-eb8287c0e9e5 req-e31a37e0-ce3c-48a1-ad3a-19ccb342982e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-unplugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state resize_migrating.
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.743 182096 INFO nova.virt.libvirt.driver [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance shutdown successfully after 3 seconds.
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.747 182096 INFO nova.virt.libvirt.driver [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Instance destroyed successfully.
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.748 182096 DEBUG nova.virt.libvirt.vif [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:23:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:25:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-549684817-network", "vif_mac": "fa:16:3e:27:52:52"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.748 182096 DEBUG nova.network.os_vif_util [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-549684817-network", "vif_mac": "fa:16:3e:27:52:52"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.748 182096 DEBUG nova.network.os_vif_util [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.749 182096 DEBUG os_vif [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.750 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.750 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1eb1ee2-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.751 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.753 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.754 182096 INFO os_vif [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.757 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.804 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.804 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.850 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.851 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Copying file /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk to 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:25:29 compute-0 nova_compute[182092]: 2026-01-23 09:25:29.852 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:30 compute-0 podman[219823]: 2026-01-23 09:25:30.222230487 +0000 UTC m=+0.059321034 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:25:30 compute-0 podman[219824]: 2026-01-23 09:25:30.22301094 +0000 UTC m=+0.055001636 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.304 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "scp -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.305 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Copying file /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.306 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.config 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.480 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.config 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.config" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.481 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Copying file /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.481 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.info 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.646 182096 DEBUG oslo_concurrency.processutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7_resize/disk.info 192.168.122.101:/var/lib/nova/instances/74eaa05c-e365-4879-af9a-1bf1c102eda7/disk.info" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.806 182096 DEBUG neutronclient.v2_0.client [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b1eb1ee2-fc97-423a-bac5-6219bd097839 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.886 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.887 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:30 compute-0 nova_compute[182092]: 2026-01-23 09:25:30.887 182096 DEBUG oslo_concurrency.lockutils [None req-52b4e5a6-3a04-4745-8971-31b878b9e58b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.506 182096 DEBUG nova.compute.manager [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.507 182096 DEBUG oslo_concurrency.lockutils [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.507 182096 DEBUG oslo_concurrency.lockutils [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.507 182096 DEBUG oslo_concurrency.lockutils [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.508 182096 DEBUG nova.compute.manager [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.508 182096 WARNING nova.compute.manager [req-678076d7-ce15-4b4c-834a-64c5677e65a0 req-1f2d1585-b549-48dc-a287-76a65f1dad48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:31 compute-0 nova_compute[182092]: 2026-01-23 09:25:31.508 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:32 compute-0 nova_compute[182092]: 2026-01-23 09:25:32.058 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.616 182096 DEBUG nova.compute.manager [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.616 182096 DEBUG nova.compute.manager [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing instance network info cache due to event network-changed-b1eb1ee2-fc97-423a-bac5-6219bd097839. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.616 182096 DEBUG oslo_concurrency.lockutils [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.617 182096 DEBUG oslo_concurrency.lockutils [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.617 182096 DEBUG nova.network.neutron [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Refreshing network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.897 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160318.8968234, 014ff798-a231-4a80-b2b9-35eba3c3e263 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.898 182096 INFO nova.compute.manager [-] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] VM Stopped (Lifecycle Event)
Jan 23 09:25:33 compute-0 nova_compute[182092]: 2026-01-23 09:25:33.912 182096 DEBUG nova.compute.manager [None req-757fb9af-784b-4575-934d-3b7b467cdd96 - - - - - -] [instance: 014ff798-a231-4a80-b2b9-35eba3c3e263] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:34 compute-0 nova_compute[182092]: 2026-01-23 09:25:34.751 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.166 182096 DEBUG nova.network.neutron [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updated VIF entry in instance network info cache for port b1eb1ee2-fc97-423a-bac5-6219bd097839. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.167 182096 DEBUG nova.network.neutron [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.177 182096 DEBUG oslo_concurrency.lockutils [req-37daf921-d1b2-41ae-8280-e3de39ae1273 req-e7fc5e14-a97e-4696-a4e4-d73dc28eed3b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.405 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.405 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.405 182096 DEBUG nova.compute.manager [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.429 182096 DEBUG nova.objects.instance [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.750 182096 DEBUG nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.751 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.751 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.751 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 DEBUG nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 WARNING nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state resized and task_state None.
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 DEBUG nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.752 182096 DEBUG oslo_concurrency.lockutils [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.753 182096 DEBUG nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] No waiting events found dispatching network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:35 compute-0 nova_compute[182092]: 2026-01-23 09:25:35.753 182096 WARNING nova.compute.manager [req-b94c0820-37d3-45f9-978f-c1a6e361fc0c req-61f54ccb-64db-47f9-a00e-501492ee5b19 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Received unexpected event network-vif-plugged-b1eb1ee2-fc97-423a-bac5-6219bd097839 for instance with vm_state resized and task_state None.
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.041 182096 DEBUG neutronclient.v2_0.client [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b1eb1ee2-fc97-423a-bac5-6219bd097839 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.042 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.042 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.042 182096 DEBUG nova.network.neutron [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.303 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.303 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.316 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.431 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.431 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.437 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.437 182096 INFO nova.compute.claims [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.538 182096 DEBUG nova.compute.provider_tree [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.550 182096 DEBUG nova.scheduler.client.report [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.568 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.568 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.628 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.628 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.645 182096 INFO nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.673 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.779 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.780 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.780 182096 INFO nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Creating image(s)
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.780 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.781 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.781 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.791 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.826 182096 DEBUG nova.policy [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc644e38442747bca17c837db68e75c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b65592e54488427f9c77e89ceb3bcd59', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.836 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.836 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.836 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.846 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.888 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.889 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.910 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.910 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.911 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.952 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.953 182096 DEBUG nova.virt.disk.api [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Checking if we can resize image /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.953 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.996 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.996 182096 DEBUG nova.virt.disk.api [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Cannot resize image /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:25:36 compute-0 nova_compute[182092]: 2026-01-23 09:25:36.997 182096 DEBUG nova.objects.instance [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lazy-loading 'migration_context' on Instance uuid 477e372f-4119-4251-9759-6196a5c39ac4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.012 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.013 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Ensure instance console log exists: /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.013 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.013 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.014 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.060 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.578 182096 DEBUG nova.network.neutron [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Updating instance_info_cache with network_info: [{"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.606 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-74eaa05c-e365-4879-af9a-1bf1c102eda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.607 182096 DEBUG nova.objects.instance [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 74eaa05c-e365-4879-af9a-1bf1c102eda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.633 182096 DEBUG nova.virt.libvirt.vif [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:23:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1490124151',display_name='tempest-ServerActionsTestJSON-server-1490124151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1490124151',id=75,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:25:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-doifwx0j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:25:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=74eaa05c-e365-4879-af9a-1bf1c102eda7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.633 182096 DEBUG nova.network.os_vif_util [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "address": "fa:16:3e:27:52:52", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1eb1ee2-fc", "ovs_interfaceid": "b1eb1ee2-fc97-423a-bac5-6219bd097839", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.634 182096 DEBUG nova.network.os_vif_util [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.634 182096 DEBUG os_vif [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.635 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.635 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1eb1ee2-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.635 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.637 182096 INFO os_vif [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:52:52,bridge_name='br-int',has_traffic_filtering=True,id=b1eb1ee2-fc97-423a-bac5-6219bd097839,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1eb1ee2-fc')
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.637 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.637 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.655 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Successfully created port: 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.710 182096 DEBUG nova.compute.provider_tree [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.721 182096 DEBUG nova.scheduler.client.report [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.752 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.870 182096 INFO nova.scheduler.client.report [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Deleted allocation for migration a4a4c505-3d6d-4664-95f7-df2800849143
Jan 23 09:25:37 compute-0 nova_compute[182092]: 2026-01-23 09:25:37.926 182096 DEBUG oslo_concurrency.lockutils [None req-620de2e1-526b-4a25-b753-10b45bc3cbe3 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "74eaa05c-e365-4879-af9a-1bf1c102eda7" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.353 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Successfully updated port: 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.379 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.379 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquired lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.379 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.440 182096 DEBUG nova.compute.manager [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-changed-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.440 182096 DEBUG nova.compute.manager [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Refreshing instance network info cache due to event network-changed-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.440 182096 DEBUG oslo_concurrency.lockutils [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:38 compute-0 nova_compute[182092]: 2026-01-23 09:25:38.520 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.522 182096 DEBUG nova.compute.manager [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.596 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.596 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.610 182096 DEBUG nova.network.neutron [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updating instance_info_cache with network_info: [{"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.620 182096 DEBUG nova.objects.instance [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.622 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Releasing lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.622 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Instance network_info: |[{"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.622 182096 DEBUG oslo_concurrency.lockutils [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.623 182096 DEBUG nova.network.neutron [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Refreshing network info cache for port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.625 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Start _get_guest_xml network_info=[{"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.629 182096 WARNING nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.633 182096 DEBUG nova.virt.libvirt.host [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.634 182096 DEBUG nova.virt.libvirt.host [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.636 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.636 182096 INFO nova.compute.claims [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.636 182096 DEBUG nova.objects.instance [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'resources' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.642 182096 DEBUG nova.virt.libvirt.host [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.642 182096 DEBUG nova.virt.libvirt.host [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.643 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.643 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.643 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.643 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.644 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.644 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.644 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.644 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.644 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.645 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.645 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.645 182096 DEBUG nova.virt.hardware [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.647 182096 DEBUG nova.virt.libvirt.vif [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=86,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxZ7Tf0in3goJ2uObE95IcuJ64XtlSAPA58mWZExOlcVtK6HGkidKb+CZb+R/oLiCg6Hj8yvu9pJLycKNNqcdR1PBBjeqSP50WOdXM59knvkNqtLYjKQdraQ8tZ0LSD4g==',key_name='tempest-keypair-1714202238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b65592e54488427f9c77e89ceb3bcd59',ramdisk_id='',reservation_id='r-5j20ahg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',im
age_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-912598233',owner_user_name='tempest-ServersTestFqdnHostnames-912598233-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc644e38442747bca17c837db68e75c5',uuid=477e372f-4119-4251-9759-6196a5c39ac4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.648 182096 DEBUG nova.network.os_vif_util [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converting VIF {"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.648 182096 DEBUG nova.network.os_vif_util [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.649 182096 DEBUG nova.objects.instance [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lazy-loading 'pci_devices' on Instance uuid 477e372f-4119-4251-9759-6196a5c39ac4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.650 182096 DEBUG nova.objects.instance [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.659 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <uuid>477e372f-4119-4251-9759-6196a5c39ac4</uuid>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <name>instance-00000056</name>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:name>guest-instance-1.domain.com</nova:name>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:25:39</nova:creationTime>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:user uuid="dc644e38442747bca17c837db68e75c5">tempest-ServersTestFqdnHostnames-912598233-project-member</nova:user>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:project uuid="b65592e54488427f9c77e89ceb3bcd59">tempest-ServersTestFqdnHostnames-912598233</nova:project>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         <nova:port uuid="2c50adfe-51b5-42e7-be5a-8c935d5ae8e0">
Jan 23 09:25:39 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="serial">477e372f-4119-4251-9759-6196a5c39ac4</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="uuid">477e372f-4119-4251-9759-6196a5c39ac4</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.config"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:a7:29:f3"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <target dev="tap2c50adfe-51"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/console.log" append="off"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:25:39 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:25:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:25:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:25:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:25:39 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.660 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Preparing to wait for external event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.660 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.661 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.661 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.661 182096 DEBUG nova.virt.libvirt.vif [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=86,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxZ7Tf0in3goJ2uObE95IcuJ64XtlSAPA58mWZExOlcVtK6HGkidKb+CZb+R/oLiCg6Hj8yvu9pJLycKNNqcdR1PBBjeqSP50WOdXM59knvkNqtLYjKQdraQ8tZ0LSD4g==',key_name='tempest-keypair-1714202238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b65592e54488427f9c77e89ceb3bcd59',ramdisk_id='',reservation_id='r-5j20ahg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='
virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-912598233',owner_user_name='tempest-ServersTestFqdnHostnames-912598233-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc644e38442747bca17c837db68e75c5',uuid=477e372f-4119-4251-9759-6196a5c39ac4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.661 182096 DEBUG nova.network.os_vif_util [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converting VIF {"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.662 182096 DEBUG nova.network.os_vif_util [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.662 182096 DEBUG os_vif [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.663 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.663 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.663 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.667 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.667 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c50adfe-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.667 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c50adfe-51, col_values=(('external_ids', {'iface-id': '2c50adfe-51b5-42e7-be5a-8c935d5ae8e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:29:f3', 'vm-uuid': '477e372f-4119-4251-9759-6196a5c39ac4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:39 compute-0 NetworkManager[54920]: <info>  [1769160339.6699] manager: (tap2c50adfe-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.673 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.674 182096 INFO os_vif [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51')
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.690 182096 INFO nova.compute.resource_tracker [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating resource usage from migration 18d8f6ad-0f84-431e-8bbe-cd8ce8c4e695
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.690 182096 DEBUG nova.compute.resource_tracker [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Starting to track incoming migration 18d8f6ad-0f84-431e-8bbe-cd8ce8c4e695 with flavor 9e575731-b613-4b19-83e1-51cae9e2c5da _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.711 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.712 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.712 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] No VIF found with MAC fa:16:3e:a7:29:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.712 182096 INFO nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Using config drive
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.758 182096 DEBUG nova.compute.provider_tree [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.768 182096 DEBUG nova.scheduler.client.report [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.787 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:39 compute-0 nova_compute[182092]: 2026-01-23 09:25:39.787 182096 INFO nova.compute.manager [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Migrating
Jan 23 09:25:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:39.860 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:39.860 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:39.861 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.017 182096 INFO nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Creating config drive at /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.config
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.022 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpttch9dix execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.142 182096 DEBUG oslo_concurrency.processutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpttch9dix" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:25:40 compute-0 kernel: tap2c50adfe-51: entered promiscuous mode
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.1787] manager: (tap2c50adfe-51): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Jan 23 09:25:40 compute-0 ovn_controller[94697]: 2026-01-23T09:25:40Z|00301|binding|INFO|Claiming lport 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 for this chassis.
Jan 23 09:25:40 compute-0 ovn_controller[94697]: 2026-01-23T09:25:40Z|00302|binding|INFO|2c50adfe-51b5-42e7-be5a-8c935d5ae8e0: Claiming fa:16:3e:a7:29:f3 10.100.0.9
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.179 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.186 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:29:f3 10.100.0.9'], port_security=['fa:16:3e:a7:29:f3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '477e372f-4119-4251-9759-6196a5c39ac4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f5ee857-c849-420e-9553-650a37d1a762', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b65592e54488427f9c77e89ceb3bcd59', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'effecef3-baed-4049-850b-c3bd0f8effe8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10ead035-8a44-4e0a-a356-42f5bc32a5a5, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.187 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 in datapath 0f5ee857-c849-420e-9553-650a37d1a762 bound to our chassis
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.189 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f5ee857-c849-420e-9553-650a37d1a762
Jan 23 09:25:40 compute-0 ovn_controller[94697]: 2026-01-23T09:25:40Z|00303|binding|INFO|Setting lport 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 ovn-installed in OVS
Jan 23 09:25:40 compute-0 ovn_controller[94697]: 2026-01-23T09:25:40Z|00304|binding|INFO|Setting lport 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 up in Southbound
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.194 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.196 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 systemd-udevd[219898]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.201 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a04fd4d8-faaa-4955-8922-dbb5ef92ac54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.202 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f5ee857-c1 in ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.203 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f5ee857-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.204 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e956d39c-98c7-446a-8e46-b8e10db2e565]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.204 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[211f1823-88b9-4e6e-a794-2b2031d7de90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.212 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd191a0-cac1-43cc-a64d-b93b047c67b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.2139] device (tap2c50adfe-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.2144] device (tap2c50adfe-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:25:40 compute-0 systemd-machined[153562]: New machine qemu-43-instance-00000056.
Jan 23 09:25:40 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000056.
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.232 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e06577a9-a6eb-449b-a5a8-4268b5db1f9d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.252 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6a7d16e3-fe45-466d-a587-cd85aa2f83ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.2562] manager: (tap0f5ee857-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.256 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a3e917-107d-43f9-bc50-44fe881303f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.284 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0964fe-1529-4705-9d29-30bc5ac2d738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.287 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[af53f6d6-9eba-4cd8-a374-00735bc879cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.3099] device (tap0f5ee857-c0): carrier: link connected
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.314 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5e358dd1-f581-4ba0-98a2-536efb32393b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.328 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb6da85-ee98-4949-8f32-1a783dc1a165]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f5ee857-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:42:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384579, 'reachable_time': 19658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219925, 'error': None, 'target': 'ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.340 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f3354423-fb8d-4f4c-88db-dc3503c6af3f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:42cd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384579, 'tstamp': 384579}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219927, 'error': None, 'target': 'ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.354 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dc201d88-d433-4bea-a928-01ba43b0d018]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f5ee857-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:42:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384579, 'reachable_time': 19658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219932, 'error': None, 'target': 'ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.378 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0f95d3-9948-4a0d-b53e-c601e190cccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.414 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160340.4140964, 477e372f-4119-4251-9759-6196a5c39ac4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.414 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] VM Started (Lifecycle Event)
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.423 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bbf0ce-5304-41ca-bbe9-861e03e81706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.424 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f5ee857-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.425 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.425 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f5ee857-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:40 compute-0 kernel: tap0f5ee857-c0: entered promiscuous mode
Jan 23 09:25:40 compute-0 NetworkManager[54920]: <info>  [1769160340.4273] manager: (tap0f5ee857-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.426 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.428 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.430 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.431 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f5ee857-c0, col_values=(('external_ids', {'iface-id': 'e928d376-91f7-41b8-b6ed-7452681c6fc8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:25:40 compute-0 ovn_controller[94697]: 2026-01-23T09:25:40Z|00305|binding|INFO|Releasing lport e928d376-91f7-41b8-b6ed-7452681c6fc8 from this chassis (sb_readonly=0)
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.432 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.433 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f5ee857-c849-420e-9553-650a37d1a762.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f5ee857-c849-420e-9553-650a37d1a762.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.434 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160340.4141936, 477e372f-4119-4251-9759-6196a5c39ac4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.434 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] VM Paused (Lifecycle Event)
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.434 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[046c15b8-1149-4afd-a61c-da3945467464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.435 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-0f5ee857-c849-420e-9553-650a37d1a762
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/0f5ee857-c849-420e-9553-650a37d1a762.pid.haproxy
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 0f5ee857-c849-420e-9553-650a37d1a762
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:25:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:25:40.435 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762', 'env', 'PROCESS_TAG=haproxy-0f5ee857-c849-420e-9553-650a37d1a762', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f5ee857-c849-420e-9553-650a37d1a762.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.444 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.445 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.447 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.458 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:25:40 compute-0 podman[219962]: 2026-01-23 09:25:40.706142715 +0000 UTC m=+0.031242211 container create 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:25:40 compute-0 systemd[1]: Started libpod-conmon-7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983.scope.
Jan 23 09:25:40 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:25:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd72e00d3cc8574647198aaf3f3a2df81fbfa9646f095007619b034e07cd40ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:25:40 compute-0 podman[219962]: 2026-01-23 09:25:40.767276048 +0000 UTC m=+0.092375554 container init 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:25:40 compute-0 podman[219962]: 2026-01-23 09:25:40.771754106 +0000 UTC m=+0.096853603 container start 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:25:40 compute-0 podman[219962]: 2026-01-23 09:25:40.691413228 +0000 UTC m=+0.016512744 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.779 182096 DEBUG nova.compute.manager [req-08412754-180f-4976-b476-e0ee863a31fb req-243c50b7-02c3-493d-9eab-371503103aa4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.779 182096 DEBUG oslo_concurrency.lockutils [req-08412754-180f-4976-b476-e0ee863a31fb req-243c50b7-02c3-493d-9eab-371503103aa4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.779 182096 DEBUG oslo_concurrency.lockutils [req-08412754-180f-4976-b476-e0ee863a31fb req-243c50b7-02c3-493d-9eab-371503103aa4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.780 182096 DEBUG oslo_concurrency.lockutils [req-08412754-180f-4976-b476-e0ee863a31fb req-243c50b7-02c3-493d-9eab-371503103aa4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.780 182096 DEBUG nova.compute.manager [req-08412754-180f-4976-b476-e0ee863a31fb req-243c50b7-02c3-493d-9eab-371503103aa4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Processing event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.780 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:25:40 compute-0 podman[219972]: 2026-01-23 09:25:40.783577492 +0000 UTC m=+0.052416588 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.787 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160340.7857862, 477e372f-4119-4251-9759-6196a5c39ac4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.787 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] VM Resumed (Lifecycle Event)
Jan 23 09:25:40 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [NOTICE]   (220008) : New worker (220018) forked
Jan 23 09:25:40 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [NOTICE]   (220008) : Loading success.
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.790 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.796 182096 INFO nova.virt.libvirt.driver [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Instance spawned successfully.
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.797 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.807 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:40 compute-0 podman[219975]: 2026-01-23 09:25:40.810510823 +0000 UTC m=+0.075586631 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.813 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.816 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.816 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.817 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.817 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.818 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.818 182096 DEBUG nova.virt.libvirt.driver [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.841 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.869 182096 INFO nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Took 4.09 seconds to spawn the instance on the hypervisor.
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.870 182096 DEBUG nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.944 182096 INFO nova.compute.manager [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Took 4.54 seconds to build instance.
Jan 23 09:25:40 compute-0 nova_compute[182092]: 2026-01-23 09:25:40.958 182096 DEBUG oslo_concurrency.lockutils [None req-b4bd49df-7704-4652-8cc2-276750d1fd24 dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:42 compute-0 nova_compute[182092]: 2026-01-23 09:25:42.061 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:42 compute-0 nova_compute[182092]: 2026-01-23 09:25:42.128 182096 DEBUG nova.network.neutron [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updated VIF entry in instance network info cache for port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:25:42 compute-0 nova_compute[182092]: 2026-01-23 09:25:42.129 182096 DEBUG nova.network.neutron [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updating instance_info_cache with network_info: [{"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:42 compute-0 nova_compute[182092]: 2026-01-23 09:25:42.144 182096 DEBUG oslo_concurrency.lockutils [req-8e3a3c04-797d-4bb9-a631-07869c73f6d4 req-db75b9ff-e737-4614-aae4-9490e9478c15 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.116 182096 DEBUG nova.compute.manager [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.117 182096 DEBUG oslo_concurrency.lockutils [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.117 182096 DEBUG oslo_concurrency.lockutils [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.118 182096 DEBUG oslo_concurrency.lockutils [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.118 182096 DEBUG nova.compute.manager [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] No waiting events found dispatching network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:43 compute-0 nova_compute[182092]: 2026-01-23 09:25:43.118 182096 WARNING nova.compute.manager [req-d808237a-107b-47c7-b15e-80bf7fefbae2 req-cede7d70-f015-4714-803b-357e2fb80a09 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received unexpected event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 for instance with vm_state active and task_state None.
Jan 23 09:25:43 compute-0 podman[220023]: 2026-01-23 09:25:43.204267484 +0000 UTC m=+0.041803386 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Jan 23 09:25:43 compute-0 sshd-session[220042]: Accepted publickey for nova from 192.168.122.101 port 60204 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:25:43 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:25:43 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:25:43 compute-0 systemd-logind[746]: New session 57 of user nova.
Jan 23 09:25:43 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:25:43 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:25:43 compute-0 systemd[220046]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:43 compute-0 systemd[220046]: Queued start job for default target Main User Target.
Jan 23 09:25:43 compute-0 systemd[220046]: Created slice User Application Slice.
Jan 23 09:25:43 compute-0 systemd[220046]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:25:43 compute-0 systemd[220046]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:25:43 compute-0 systemd[220046]: Reached target Paths.
Jan 23 09:25:43 compute-0 systemd[220046]: Reached target Timers.
Jan 23 09:25:43 compute-0 systemd[220046]: Starting D-Bus User Message Bus Socket...
Jan 23 09:25:43 compute-0 systemd[220046]: Starting Create User's Volatile Files and Directories...
Jan 23 09:25:43 compute-0 systemd[220046]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:25:43 compute-0 systemd[220046]: Reached target Sockets.
Jan 23 09:25:43 compute-0 systemd[220046]: Finished Create User's Volatile Files and Directories.
Jan 23 09:25:43 compute-0 systemd[220046]: Reached target Basic System.
Jan 23 09:25:43 compute-0 systemd[220046]: Reached target Main User Target.
Jan 23 09:25:43 compute-0 systemd[220046]: Startup finished in 111ms.
Jan 23 09:25:43 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:25:43 compute-0 systemd[1]: Started Session 57 of User nova.
Jan 23 09:25:43 compute-0 sshd-session[220042]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:43 compute-0 sshd-session[220062]: Received disconnect from 192.168.122.101 port 60204:11: disconnected by user
Jan 23 09:25:43 compute-0 sshd-session[220062]: Disconnected from user nova 192.168.122.101 port 60204
Jan 23 09:25:43 compute-0 sshd-session[220042]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:25:43 compute-0 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 09:25:43 compute-0 systemd-logind[746]: Session 57 logged out. Waiting for processes to exit.
Jan 23 09:25:43 compute-0 systemd-logind[746]: Removed session 57.
Jan 23 09:25:43 compute-0 sshd-session[220064]: Accepted publickey for nova from 192.168.122.101 port 60206 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:25:43 compute-0 systemd-logind[746]: New session 59 of user nova.
Jan 23 09:25:43 compute-0 systemd[1]: Started Session 59 of User nova.
Jan 23 09:25:43 compute-0 sshd-session[220064]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:43 compute-0 sshd-session[220067]: Received disconnect from 192.168.122.101 port 60206:11: disconnected by user
Jan 23 09:25:43 compute-0 sshd-session[220067]: Disconnected from user nova 192.168.122.101 port 60206
Jan 23 09:25:43 compute-0 sshd-session[220064]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:25:43 compute-0 systemd[1]: session-59.scope: Deactivated successfully.
Jan 23 09:25:43 compute-0 systemd-logind[746]: Session 59 logged out. Waiting for processes to exit.
Jan 23 09:25:43 compute-0 systemd-logind[746]: Removed session 59.
Jan 23 09:25:44 compute-0 nova_compute[182092]: 2026-01-23 09:25:44.099 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160329.0984602, 74eaa05c-e365-4879-af9a-1bf1c102eda7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:25:44 compute-0 nova_compute[182092]: 2026-01-23 09:25:44.100 182096 INFO nova.compute.manager [-] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] VM Stopped (Lifecycle Event)
Jan 23 09:25:44 compute-0 nova_compute[182092]: 2026-01-23 09:25:44.251 182096 DEBUG nova.compute.manager [None req-0e8b3816-c82d-400c-b9f9-161b27bcb80b - - - - - -] [instance: 74eaa05c-e365-4879-af9a-1bf1c102eda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:25:44 compute-0 nova_compute[182092]: 2026-01-23 09:25:44.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:45 compute-0 nova_compute[182092]: 2026-01-23 09:25:45.171 182096 DEBUG nova.compute.manager [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-changed-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:45 compute-0 nova_compute[182092]: 2026-01-23 09:25:45.171 182096 DEBUG nova.compute.manager [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Refreshing instance network info cache due to event network-changed-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:25:45 compute-0 nova_compute[182092]: 2026-01-23 09:25:45.172 182096 DEBUG oslo_concurrency.lockutils [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:25:45 compute-0 nova_compute[182092]: 2026-01-23 09:25:45.172 182096 DEBUG oslo_concurrency.lockutils [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:25:45 compute-0 nova_compute[182092]: 2026-01-23 09:25:45.172 182096 DEBUG nova.network.neutron [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Refreshing network info cache for port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:25:46 compute-0 nova_compute[182092]: 2026-01-23 09:25:46.453 182096 DEBUG nova.network.neutron [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updated VIF entry in instance network info cache for port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:25:46 compute-0 nova_compute[182092]: 2026-01-23 09:25:46.454 182096 DEBUG nova.network.neutron [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updating instance_info_cache with network_info: [{"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:25:46 compute-0 nova_compute[182092]: 2026-01-23 09:25:46.467 182096 DEBUG oslo_concurrency.lockutils [req-a68d19ae-5c5a-45fc-8d51-c94299f1ee99 req-a4df838d-d168-4cb7-85f6-93c7681b4ad3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-477e372f-4119-4251-9759-6196a5c39ac4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:25:47 compute-0 nova_compute[182092]: 2026-01-23 09:25:47.062 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:49 compute-0 nova_compute[182092]: 2026-01-23 09:25:49.669 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:52 compute-0 nova_compute[182092]: 2026-01-23 09:25:52.064 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:52 compute-0 ovn_controller[94697]: 2026-01-23T09:25:52Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:29:f3 10.100.0.9
Jan 23 09:25:52 compute-0 ovn_controller[94697]: 2026-01-23T09:25:52Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:29:f3 10.100.0.9
Jan 23 09:25:53 compute-0 podman[220085]: 2026-01-23 09:25:53.225334891 +0000 UTC m=+0.057873583 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:25:54 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:25:54 compute-0 systemd[220046]: Activating special unit Exit the Session...
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped target Main User Target.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped target Basic System.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped target Paths.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped target Sockets.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped target Timers.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:25:54 compute-0 systemd[220046]: Closed D-Bus User Message Bus Socket.
Jan 23 09:25:54 compute-0 systemd[220046]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:25:54 compute-0 systemd[220046]: Removed slice User Application Slice.
Jan 23 09:25:54 compute-0 systemd[220046]: Reached target Shutdown.
Jan 23 09:25:54 compute-0 systemd[220046]: Finished Exit the Session.
Jan 23 09:25:54 compute-0 systemd[220046]: Reached target Exit the Session.
Jan 23 09:25:54 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:25:54 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:25:54 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:25:54 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:25:54 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:25:54 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:25:54 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:25:54 compute-0 nova_compute[182092]: 2026-01-23 09:25:54.671 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.405 182096 DEBUG nova.compute.manager [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.405 182096 DEBUG oslo_concurrency.lockutils [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.405 182096 DEBUG oslo_concurrency.lockutils [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.406 182096 DEBUG oslo_concurrency.lockutils [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.406 182096 DEBUG nova.compute.manager [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:56 compute-0 nova_compute[182092]: 2026-01-23 09:25:56.406 182096 WARNING nova.compute.manager [req-b657253a-354d-44a5-837c-12f4a5c8678a req-fc6304a4-833c-406f-b407-5fb26ea2b708 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrating.
Jan 23 09:25:57 compute-0 nova_compute[182092]: 2026-01-23 09:25:57.066 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:25:57 compute-0 sshd-session[220109]: Accepted publickey for nova from 192.168.122.101 port 47162 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:25:57 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 09:25:57 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 09:25:57 compute-0 systemd-logind[746]: New session 60 of user nova.
Jan 23 09:25:57 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 09:25:57 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 23 09:25:57 compute-0 systemd[220113]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:57 compute-0 systemd[220113]: Queued start job for default target Main User Target.
Jan 23 09:25:57 compute-0 systemd[220113]: Created slice User Application Slice.
Jan 23 09:25:57 compute-0 systemd[220113]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:25:57 compute-0 systemd[220113]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:25:57 compute-0 systemd[220113]: Reached target Paths.
Jan 23 09:25:57 compute-0 systemd[220113]: Reached target Timers.
Jan 23 09:25:57 compute-0 systemd[220113]: Starting D-Bus User Message Bus Socket...
Jan 23 09:25:57 compute-0 systemd[220113]: Starting Create User's Volatile Files and Directories...
Jan 23 09:25:57 compute-0 systemd[220113]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:25:57 compute-0 systemd[220113]: Reached target Sockets.
Jan 23 09:25:57 compute-0 systemd[220113]: Finished Create User's Volatile Files and Directories.
Jan 23 09:25:57 compute-0 systemd[220113]: Reached target Basic System.
Jan 23 09:25:57 compute-0 systemd[220113]: Reached target Main User Target.
Jan 23 09:25:57 compute-0 systemd[220113]: Startup finished in 97ms.
Jan 23 09:25:57 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 23 09:25:57 compute-0 systemd[1]: Started Session 60 of User nova.
Jan 23 09:25:57 compute-0 sshd-session[220109]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:57 compute-0 sshd-session[220128]: Received disconnect from 192.168.122.101 port 47162:11: disconnected by user
Jan 23 09:25:57 compute-0 sshd-session[220128]: Disconnected from user nova 192.168.122.101 port 47162
Jan 23 09:25:57 compute-0 sshd-session[220109]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:25:57 compute-0 systemd[1]: session-60.scope: Deactivated successfully.
Jan 23 09:25:57 compute-0 systemd-logind[746]: Session 60 logged out. Waiting for processes to exit.
Jan 23 09:25:57 compute-0 systemd-logind[746]: Removed session 60.
Jan 23 09:25:57 compute-0 sshd-session[220130]: Accepted publickey for nova from 192.168.122.101 port 47168 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:25:57 compute-0 systemd-logind[746]: New session 62 of user nova.
Jan 23 09:25:57 compute-0 systemd[1]: Started Session 62 of User nova.
Jan 23 09:25:57 compute-0 sshd-session[220130]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:57 compute-0 sshd-session[220133]: Received disconnect from 192.168.122.101 port 47168:11: disconnected by user
Jan 23 09:25:57 compute-0 sshd-session[220133]: Disconnected from user nova 192.168.122.101 port 47168
Jan 23 09:25:57 compute-0 sshd-session[220130]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:25:57 compute-0 systemd-logind[746]: Session 62 logged out. Waiting for processes to exit.
Jan 23 09:25:57 compute-0 systemd[1]: session-62.scope: Deactivated successfully.
Jan 23 09:25:57 compute-0 systemd-logind[746]: Removed session 62.
Jan 23 09:25:58 compute-0 sshd-session[220135]: Accepted publickey for nova from 192.168.122.101 port 47182 ssh2: ECDSA SHA256:7laMRNbSTU1ygG8OI4/CSVxHW+Ob632eUfnCRNoyIYU
Jan 23 09:25:58 compute-0 systemd-logind[746]: New session 63 of user nova.
Jan 23 09:25:58 compute-0 systemd[1]: Started Session 63 of User nova.
Jan 23 09:25:58 compute-0 sshd-session[220135]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 23 09:25:58 compute-0 sshd-session[220138]: Received disconnect from 192.168.122.101 port 47182:11: disconnected by user
Jan 23 09:25:58 compute-0 sshd-session[220138]: Disconnected from user nova 192.168.122.101 port 47182
Jan 23 09:25:58 compute-0 sshd-session[220135]: pam_unix(sshd:session): session closed for user nova
Jan 23 09:25:58 compute-0 systemd[1]: session-63.scope: Deactivated successfully.
Jan 23 09:25:58 compute-0 systemd-logind[746]: Session 63 logged out. Waiting for processes to exit.
Jan 23 09:25:58 compute-0 systemd-logind[746]: Removed session 63.
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.481 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.483 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.483 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.483 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.483 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 WARNING nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.484 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.485 182096 WARNING nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.485 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.485 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.485 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.485 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.486 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.486 182096 WARNING nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.486 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.486 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.486 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 WARNING nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.487 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.488 182096 DEBUG oslo_concurrency.lockutils [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.488 182096 DEBUG nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:25:58 compute-0 nova_compute[182092]: 2026-01-23 09:25:58.488 182096 WARNING nova.compute.manager [req-2a8fb0f8-131b-466c-86b5-60c6e7fe14e9 req-c481e690-41ee-4356-b003-d12d08b6573e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state active and task_state resize_migrated.
Jan 23 09:25:59 compute-0 nova_compute[182092]: 2026-01-23 09:25:59.105 182096 INFO nova.network.neutron [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating port b03613c5-81a2-4cce-8c31-aa177cb2e3e0 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 09:25:59 compute-0 nova_compute[182092]: 2026-01-23 09:25:59.672 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.622 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.622 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.622 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.622 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.623 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.629 182096 INFO nova.compute.manager [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Terminating instance
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.634 182096 DEBUG nova.compute.manager [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:00 compute-0 kernel: tap2c50adfe-51 (unregistering): left promiscuous mode
Jan 23 09:26:00 compute-0 NetworkManager[54920]: <info>  [1769160360.6554] device (tap2c50adfe-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.660 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 ovn_controller[94697]: 2026-01-23T09:26:00Z|00306|binding|INFO|Releasing lport 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 from this chassis (sb_readonly=0)
Jan 23 09:26:00 compute-0 ovn_controller[94697]: 2026-01-23T09:26:00Z|00307|binding|INFO|Setting lport 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 down in Southbound
Jan 23 09:26:00 compute-0 ovn_controller[94697]: 2026-01-23T09:26:00Z|00308|binding|INFO|Removing iface tap2c50adfe-51 ovn-installed in OVS
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.665 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.669 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:29:f3 10.100.0.9'], port_security=['fa:16:3e:a7:29:f3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '477e372f-4119-4251-9759-6196a5c39ac4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f5ee857-c849-420e-9553-650a37d1a762', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b65592e54488427f9c77e89ceb3bcd59', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'effecef3-baed-4049-850b-c3bd0f8effe8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10ead035-8a44-4e0a-a356-42f5bc32a5a5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.670 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 in datapath 0f5ee857-c849-420e-9553-650a37d1a762 unbound from our chassis
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.671 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f5ee857-c849-420e-9553-650a37d1a762, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.672 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e8aa0794-cc16-48d6-bb02-5ce3a561934a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.672 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762 namespace which is not needed anymore
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.679 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 23 09:26:00 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000056.scope: Consumed 10.991s CPU time.
Jan 23 09:26:00 compute-0 systemd-machined[153562]: Machine qemu-43-instance-00000056 terminated.
Jan 23 09:26:00 compute-0 podman[220144]: 2026-01-23 09:26:00.737578496 +0000 UTC m=+0.057951941 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:26:00 compute-0 podman[220141]: 2026-01-23 09:26:00.755029238 +0000 UTC m=+0.074868116 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:26:00 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [NOTICE]   (220008) : haproxy version is 2.8.14-c23fe91
Jan 23 09:26:00 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [NOTICE]   (220008) : path to executable is /usr/sbin/haproxy
Jan 23 09:26:00 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [WARNING]  (220008) : Exiting Master process...
Jan 23 09:26:00 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [ALERT]    (220008) : Current worker (220018) exited with code 143 (Terminated)
Jan 23 09:26:00 compute-0 neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762[219988]: [WARNING]  (220008) : All workers exited. Exiting... (0)
Jan 23 09:26:00 compute-0 systemd[1]: libpod-7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983.scope: Deactivated successfully.
Jan 23 09:26:00 compute-0 conmon[219988]: conmon 7c8532f7a5d7fed9103d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983.scope/container/memory.events
Jan 23 09:26:00 compute-0 podman[220193]: 2026-01-23 09:26:00.775037865 +0000 UTC m=+0.034114581 container died 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:26:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983-userdata-shm.mount: Deactivated successfully.
Jan 23 09:26:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd72e00d3cc8574647198aaf3f3a2df81fbfa9646f095007619b034e07cd40ba-merged.mount: Deactivated successfully.
Jan 23 09:26:00 compute-0 podman[220193]: 2026-01-23 09:26:00.797491036 +0000 UTC m=+0.056567750 container cleanup 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:26:00 compute-0 systemd[1]: libpod-conmon-7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983.scope: Deactivated successfully.
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.803 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.804 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquired lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.804 182096 DEBUG nova.network.neutron [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:26:00 compute-0 podman[220222]: 2026-01-23 09:26:00.837356825 +0000 UTC m=+0.024221917 container remove 7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.841 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[58def7ab-ed5e-4472-b5f1-f721192d9577]: (4, ('Fri Jan 23 09:26:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762 (7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983)\n7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983\nFri Jan 23 09:26:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762 (7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983)\n7c8532f7a5d7fed9103df91ad897e47bc0dfe1c6b26091b7d72af677d9739983\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.842 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6df26d74-6834-4993-b4e0-284b4087b0f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.843 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f5ee857-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.845 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 kernel: tap0f5ee857-c0: left promiscuous mode
Jan 23 09:26:00 compute-0 kernel: tap2c50adfe-51: entered promiscuous mode
Jan 23 09:26:00 compute-0 kernel: tap2c50adfe-51 (unregistering): left promiscuous mode
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.864 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[629420f9-464f-42b9-8fa4-04e07b97e62e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.877 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[991df832-576f-43ba-8980-4ba9c8dcbf9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.877 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1db29ace-1708-4f41-a96d-5ad671a5b4da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.889 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0f717f-4d8b-4ee4-8699-068e6a6a8967]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384573, 'reachable_time': 24484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220248, 'error': None, 'target': 'ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.890 182096 INFO nova.virt.libvirt.driver [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Instance destroyed successfully.
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.890 182096 DEBUG nova.objects.instance [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lazy-loading 'resources' on Instance uuid 477e372f-4119-4251-9759-6196a5c39ac4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d0f5ee857\x2dc849\x2d420e\x2d9553\x2d650a37d1a762.mount: Deactivated successfully.
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.891 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f5ee857-c849-420e-9553-650a37d1a762 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:26:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:00.892 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d3170a-61b6-4b13-a0b1-64919a087605]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.902 182096 DEBUG nova.virt.libvirt.vif [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=86,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxZ7Tf0in3goJ2uObE95IcuJ64XtlSAPA58mWZExOlcVtK6HGkidKb+CZb+R/oLiCg6Hj8yvu9pJLycKNNqcdR1PBBjeqSP50WOdXM59knvkNqtLYjKQdraQ8tZ0LSD4g==',key_name='tempest-keypair-1714202238',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:25:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b65592e54488427f9c77e89ceb3bcd59',ramdisk_id='',reservation_id='r-5j20ahg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-912598233',owner_user_name='tempest-ServersTestFqdnHostnames-912598233-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:25:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dc644e38442747bca17c837db68e75c5',uuid=477e372f-4119-4251-9759-6196a5c39ac4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.902 182096 DEBUG nova.network.os_vif_util [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converting VIF {"id": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "address": "fa:16:3e:a7:29:f3", "network": {"id": "0f5ee857-c849-420e-9553-650a37d1a762", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1082676560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b65592e54488427f9c77e89ceb3bcd59", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c50adfe-51", "ovs_interfaceid": "2c50adfe-51b5-42e7-be5a-8c935d5ae8e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.903 182096 DEBUG nova.network.os_vif_util [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.903 182096 DEBUG os_vif [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.904 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.904 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c50adfe-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.908 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.910 182096 INFO os_vif [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:29:f3,bridge_name='br-int',has_traffic_filtering=True,id=2c50adfe-51b5-42e7-be5a-8c935d5ae8e0,network=Network(0f5ee857-c849-420e-9553-650a37d1a762),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c50adfe-51')
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.910 182096 INFO nova.virt.libvirt.driver [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Deleting instance files /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4_del
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.911 182096 INFO nova.virt.libvirt.driver [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Deletion of /var/lib/nova/instances/477e372f-4119-4251-9759-6196a5c39ac4_del complete
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.919 182096 DEBUG nova.compute.manager [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-unplugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.919 182096 DEBUG oslo_concurrency.lockutils [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.920 182096 DEBUG oslo_concurrency.lockutils [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.920 182096 DEBUG oslo_concurrency.lockutils [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.920 182096 DEBUG nova.compute.manager [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] No waiting events found dispatching network-vif-unplugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.920 182096 DEBUG nova.compute.manager [req-d62119f7-95a2-42a2-9f6a-783d8193c6ec req-85d77e2b-614f-4b51-9c8b-369dc8940c10 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-unplugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.939 182096 DEBUG nova.compute.manager [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-changed-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.939 182096 DEBUG nova.compute.manager [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Refreshing instance network info cache due to event network-changed-b03613c5-81a2-4cce-8c31-aa177cb2e3e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:26:00 compute-0 nova_compute[182092]: 2026-01-23 09:26:00.940 182096 DEBUG oslo_concurrency.lockutils [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.031 182096 INFO nova.compute.manager [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.032 182096 DEBUG oslo.service.loopingcall [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.032 182096 DEBUG nova.compute.manager [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.032 182096 DEBUG nova.network.neutron [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.857 182096 DEBUG nova.network.neutron [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.868 182096 INFO nova.compute.manager [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Took 0.84 seconds to deallocate network for instance.
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.913 182096 DEBUG nova.compute.manager [req-4d25b27a-311c-4cc8-a783-6d5f79eeb753 req-15bda027-6616-476b-a092-26be9aa7f048 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-deleted-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.919 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.920 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:01 compute-0 nova_compute[182092]: 2026-01-23 09:26:01.986 182096 DEBUG nova.compute.provider_tree [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.012 182096 DEBUG nova.scheduler.client.report [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.034 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.058 182096 INFO nova.scheduler.client.report [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Deleted allocations for instance 477e372f-4119-4251-9759-6196a5c39ac4
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.067 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.151 182096 DEBUG oslo_concurrency.lockutils [None req-e63729b9-6b86-469d-a26a-6ac6d465ce2c dc644e38442747bca17c837db68e75c5 b65592e54488427f9c77e89ceb3bcd59 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.253 182096 DEBUG nova.network.neutron [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating instance_info_cache with network_info: [{"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.268 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Releasing lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.271 182096 DEBUG oslo_concurrency.lockutils [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.271 182096 DEBUG nova.network.neutron [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Refreshing network info cache for port b03613c5-81a2-4cce-8c31-aa177cb2e3e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.347 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.348 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.348 182096 INFO nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Creating image(s)
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.349 182096 DEBUG nova.objects.instance [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.355 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.401 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.402 182096 DEBUG nova.virt.disk.api [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Checking if we can resize image /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.402 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.448 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.449 182096 DEBUG nova.virt.disk.api [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Cannot resize image /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.462 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.462 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Ensure instance console log exists: /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.462 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.463 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.463 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.465 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Start _get_guest_xml network_info=[{"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "vif_mac": "fa:16:3e:1a:db:0e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.468 182096 WARNING nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.472 182096 DEBUG nova.virt.libvirt.host [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.472 182096 DEBUG nova.virt.libvirt.host [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.475 182096 DEBUG nova.virt.libvirt.host [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.475 182096 DEBUG nova.virt.libvirt.host [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.476 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.477 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9e575731-b613-4b19-83e1-51cae9e2c5da',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.477 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.477 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.477 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.478 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.478 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.478 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.478 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.478 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.479 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.479 182096 DEBUG nova.virt.hardware [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.479 182096 DEBUG nova.objects.instance [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.491 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.536 182096 DEBUG oslo_concurrency.processutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.config --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.537 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.538 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.538 182096 DEBUG oslo_concurrency.lockutils [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.539 182096 DEBUG nova.virt.libvirt.vif [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:25:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-17528005',display_name='tempest-ServerDiskConfigTestJSON-server-17528005',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-17528005',id=85,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:25:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-3mirjmas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw
_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:58Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=4e287d24-a3ba-4551-ac59-c2f692b6c9b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "vif_mac": "fa:16:3e:1a:db:0e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.539 182096 DEBUG nova.network.os_vif_util [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "vif_mac": "fa:16:3e:1a:db:0e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.540 182096 DEBUG nova.network.os_vif_util [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.542 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <uuid>4e287d24-a3ba-4551-ac59-c2f692b6c9b1</uuid>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <name>instance-00000055</name>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <memory>196608</memory>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-17528005</nova:name>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:26:02</nova:creationTime>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:flavor name="m1.micro">
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:memory>192</nova:memory>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:user uuid="5edc8d287a9f4ffd90f54ecea19df7e8">tempest-ServerDiskConfigTestJSON-1935371313-project-member</nova:user>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:project uuid="b9afff815c4546ad97f6d3afa2c35483">tempest-ServerDiskConfigTestJSON-1935371313</nova:project>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         <nova:port uuid="b03613c5-81a2-4cce-8c31-aa177cb2e3e0">
Jan 23 09:26:02 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <system>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="serial">4e287d24-a3ba-4551-ac59-c2f692b6c9b1</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="uuid">4e287d24-a3ba-4551-ac59-c2f692b6c9b1</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </system>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <os>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </os>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <features>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </features>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk.config"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:1a:db:0e"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <target dev="tapb03613c5-81"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/console.log" append="off"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <video>
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </video>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:26:02 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:26:02 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:26:02 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:26:02 compute-0 nova_compute[182092]: </domain>
Jan 23 09:26:02 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.543 182096 DEBUG nova.virt.libvirt.vif [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:25:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-17528005',display_name='tempest-ServerDiskConfigTestJSON-server-17528005',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-17528005',id=85,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:25:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-3mirjmas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw
_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:25:58Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=4e287d24-a3ba-4551-ac59-c2f692b6c9b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "vif_mac": "fa:16:3e:1a:db:0e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.544 182096 DEBUG nova.network.os_vif_util [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "vif_mac": "fa:16:3e:1a:db:0e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.544 182096 DEBUG nova.network.os_vif_util [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.544 182096 DEBUG os_vif [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.548 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.549 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.552 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb03613c5-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.552 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb03613c5-81, col_values=(('external_ids', {'iface-id': 'b03613c5-81a2-4cce-8c31-aa177cb2e3e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:db:0e', 'vm-uuid': '4e287d24-a3ba-4551-ac59-c2f692b6c9b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.5540] manager: (tapb03613c5-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.555 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.556 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.557 182096 INFO os_vif [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81')
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.591 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.591 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.592 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No VIF found with MAC fa:16:3e:1a:db:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.592 182096 INFO nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Using config drive
Jan 23 09:26:02 compute-0 systemd-udevd[220179]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.6265] manager: (tapb03613c5-81): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Jan 23 09:26:02 compute-0 kernel: tapb03613c5-81: entered promiscuous mode
Jan 23 09:26:02 compute-0 ovn_controller[94697]: 2026-01-23T09:26:02Z|00309|binding|INFO|Claiming lport b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for this chassis.
Jan 23 09:26:02 compute-0 ovn_controller[94697]: 2026-01-23T09:26:02Z|00310|binding|INFO|b03613c5-81a2-4cce-8c31-aa177cb2e3e0: Claiming fa:16:3e:1a:db:0e 10.100.0.12
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.628 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.634 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:db:0e 10.100.0.12'], port_security=['fa:16:3e:1a:db:0e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4e287d24-a3ba-4551-ac59-c2f692b6c9b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b03613c5-81a2-4cce-8c31-aa177cb2e3e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.635 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b03613c5-81a2-4cce-8c31-aa177cb2e3e0 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a bound to our chassis
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.636 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.6384] device (tapb03613c5-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.6388] device (tapb03613c5-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:26:02 compute-0 ovn_controller[94697]: 2026-01-23T09:26:02Z|00311|binding|INFO|Setting lport b03613c5-81a2-4cce-8c31-aa177cb2e3e0 ovn-installed in OVS
Jan 23 09:26:02 compute-0 ovn_controller[94697]: 2026-01-23T09:26:02Z|00312|binding|INFO|Setting lport b03613c5-81a2-4cce-8c31-aa177cb2e3e0 up in Southbound
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.644 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.645 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.648 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8696f9e0-d83d-4dca-9baf-c882c4ef1947]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.649 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3598732e-71 in ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.650 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3598732e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.650 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[027e43e8-ce5e-445d-b198-693815d52b5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.651 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ada22966-bf4a-40c3-8f70-e8a6fbfa941b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.660 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[31393bb8-d29c-4905-bc2a-3307a0c8b1d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.664 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.664 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.665 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.665 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:26:02 compute-0 systemd-machined[153562]: New machine qemu-44-instance-00000055.
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.671 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f599da-108d-4a1c-aa80-ba99d3293fba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000055.
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.690 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5d43d708-ebfa-4344-8aa1-c1efa9a766b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.6945] manager: (tap3598732e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.693 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3619a782-559c-4475-981b-e439e09af993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.719 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb3f046-03d7-40a0-9d71-b05c4424eb7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.721 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[33a417e4-d750-4505-ac44-ab624174a41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.7382] device (tap3598732e-70): carrier: link connected
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.742 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.744 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[148e3521-9448-4a66-b508-b792ffb1c91f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.756 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20bdbb6b-3298-4d92-8ce0-11ca00c814fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386822, 'reachable_time': 35156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220300, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.768 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0d1314d0-5787-4b00-bdc4-6fce55c5ff6d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386822, 'tstamp': 386822}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220301, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.779 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50864b61-6cd6-4d83-b03c-0e78af0ef5b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386822, 'reachable_time': 35156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220302, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.790 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.791 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.805 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b002430a-cb03-4103-b51c-af382bcf9286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.844 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7e891a-5da7-492f-8a9f-e4576a9d1d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.846 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.846 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.846 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3598732e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 kernel: tap3598732e-70: entered promiscuous mode
Jan 23 09:26:02 compute-0 NetworkManager[54920]: <info>  [1769160362.8486] manager: (tap3598732e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.853 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3598732e-70, col_values=(('external_ids', {'iface-id': '9bf071ba-d027-4af7-a154-40b491b7a535'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:02 compute-0 ovn_controller[94697]: 2026-01-23T09:26:02Z|00313|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.857 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:26:02 compute-0 nova_compute[182092]: 2026-01-23 09:26:02.867 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.867 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[87d47f21-1d79-46d5-b2fe-fd2eac23e2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.869 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:26:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:02.871 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'env', 'PROCESS_TAG=haproxy-3598732e-75d5-4a2b-8884-521ea92eab7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3598732e-75d5-4a2b-8884-521ea92eab7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.033 182096 DEBUG nova.compute.manager [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.033 182096 DEBUG oslo_concurrency.lockutils [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "477e372f-4119-4251-9759-6196a5c39ac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.034 182096 DEBUG oslo_concurrency.lockutils [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.034 182096 DEBUG oslo_concurrency.lockutils [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "477e372f-4119-4251-9759-6196a5c39ac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.034 182096 DEBUG nova.compute.manager [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] No waiting events found dispatching network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.034 182096 WARNING nova.compute.manager [req-fe165633-7916-4b8c-bf36-c3ab5f999e9b req-90c7f0b6-7ade-4465-b9d2-efceba0af501 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Received unexpected event network-vif-plugged-2c50adfe-51b5-42e7-be5a-8c935d5ae8e0 for instance with vm_state deleted and task_state None.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.113 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.114 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.23538970947266GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.114 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.114 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.152 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Applying migration context for instance 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 as it has an incoming, in-progress migration 18d8f6ad-0f84-431e-8bbe-cd8ce8c4e695. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.153 182096 INFO nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating resource usage from migration 18d8f6ad-0f84-431e-8bbe-cd8ce8c4e695
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.175 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.175 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.175 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.216 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:03 compute-0 podman[220337]: 2026-01-23 09:26:03.225323531 +0000 UTC m=+0.038358556 container create e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.229 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.252 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.252 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:03 compute-0 systemd[1]: Started libpod-conmon-e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73.scope.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.262 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160363.2618926, 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.262 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] VM Resumed (Lifecycle Event)
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.263 182096 DEBUG nova.compute.manager [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:26:03 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:26:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c7f0f4ac8207c5be9f7472119af0f7fd59087d63473d26b503f8d307d36f9ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.271 182096 INFO nova.virt.libvirt.driver [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Instance running successfully.
Jan 23 09:26:03 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.275 182096 DEBUG nova.virt.libvirt.guest [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.276 182096 DEBUG nova.virt.libvirt.driver [None req-0c049741-6abc-40de-ac18-aa1c4b0e33eb 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 09:26:03 compute-0 podman[220337]: 2026-01-23 09:26:03.279788423 +0000 UTC m=+0.092823459 container init e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.282 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.283 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:03 compute-0 podman[220337]: 2026-01-23 09:26:03.287516374 +0000 UTC m=+0.100551408 container start e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:26:03 compute-0 podman[220337]: 2026-01-23 09:26:03.207621005 +0000 UTC m=+0.020656060 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:26:03 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [NOTICE]   (220358) : New worker (220360) forked
Jan 23 09:26:03 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [NOTICE]   (220358) : Loading success.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.316 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.316 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160363.2651613, 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.316 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] VM Started (Lifecycle Event)
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.347 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.348 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.369 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.693 182096 DEBUG nova.network.neutron [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updated VIF entry in instance network info cache for port b03613c5-81a2-4cce-8c31-aa177cb2e3e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.694 182096 DEBUG nova.network.neutron [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating instance_info_cache with network_info: [{"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.717 182096 DEBUG oslo_concurrency.lockutils [req-60b8e2d7-9c3c-4724-ad08-f95c6a89e675 req-b2f5b9d1-ffc7-4fe8-b41c-cc7c1bfe6b46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:03 compute-0 nova_compute[182092]: 2026-01-23 09:26:03.815 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:03.815 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:03.816 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.032 182096 DEBUG nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.033 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.033 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.034 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.034 182096 DEBUG nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.034 182096 WARNING nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state resized and task_state None.
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.034 182096 DEBUG nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.035 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.035 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.035 182096 DEBUG oslo_concurrency.lockutils [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.035 182096 DEBUG nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.036 182096 WARNING nova.compute.manager [req-5b955447-29d0-48b3-8655-7d220859183a req-30e8bff5-9920-4c11-b356-555d27d19634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state resized and task_state None.
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.248 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.248 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.249 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.249 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.379 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.379 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.379 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:26:04 compute-0 nova_compute[182092]: 2026-01-23 09:26:04.379 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.350 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating instance_info_cache with network_info: [{"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.375 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-4e287d24-a3ba-4551-ac59-c2f692b6c9b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.375 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.376 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.376 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.376 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:06 compute-0 nova_compute[182092]: 2026-01-23 09:26:06.377 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:26:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:06.818 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:07 compute-0 nova_compute[182092]: 2026-01-23 09:26:07.071 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:07 compute-0 ovn_controller[94697]: 2026-01-23T09:26:07Z|00314|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:26:07 compute-0 nova_compute[182092]: 2026-01-23 09:26:07.179 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:07 compute-0 ovn_controller[94697]: 2026-01-23T09:26:07Z|00315|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:26:07 compute-0 nova_compute[182092]: 2026-01-23 09:26:07.332 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:07 compute-0 nova_compute[182092]: 2026-01-23 09:26:07.554 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:07 compute-0 nova_compute[182092]: 2026-01-23 09:26:07.773 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:26:08 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 09:26:08 compute-0 systemd[220113]: Activating special unit Exit the Session...
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped target Main User Target.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped target Basic System.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped target Paths.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped target Sockets.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped target Timers.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:26:08 compute-0 systemd[220113]: Closed D-Bus User Message Bus Socket.
Jan 23 09:26:08 compute-0 systemd[220113]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:26:08 compute-0 systemd[220113]: Removed slice User Application Slice.
Jan 23 09:26:08 compute-0 systemd[220113]: Reached target Shutdown.
Jan 23 09:26:08 compute-0 systemd[220113]: Finished Exit the Session.
Jan 23 09:26:08 compute-0 systemd[220113]: Reached target Exit the Session.
Jan 23 09:26:08 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 09:26:08 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 09:26:08 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 09:26:08 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 09:26:08 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 09:26:08 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 09:26:08 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 09:26:11 compute-0 podman[220369]: 2026-01-23 09:26:11.209993885 +0000 UTC m=+0.045098573 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:26:11 compute-0 podman[220368]: 2026-01-23 09:26:11.211627877 +0000 UTC m=+0.049195061 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.340 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.341 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.341 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.341 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.341 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.348 182096 INFO nova.compute.manager [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Terminating instance
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.354 182096 DEBUG nova.compute.manager [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:26:11 compute-0 kernel: tapb03613c5-81 (unregistering): left promiscuous mode
Jan 23 09:26:11 compute-0 NetworkManager[54920]: <info>  [1769160371.3770] device (tapb03613c5-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:26:11 compute-0 ovn_controller[94697]: 2026-01-23T09:26:11Z|00316|binding|INFO|Releasing lport b03613c5-81a2-4cce-8c31-aa177cb2e3e0 from this chassis (sb_readonly=0)
Jan 23 09:26:11 compute-0 ovn_controller[94697]: 2026-01-23T09:26:11Z|00317|binding|INFO|Setting lport b03613c5-81a2-4cce-8c31-aa177cb2e3e0 down in Southbound
Jan 23 09:26:11 compute-0 ovn_controller[94697]: 2026-01-23T09:26:11Z|00318|binding|INFO|Removing iface tapb03613c5-81 ovn-installed in OVS
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.382 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.384 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.387 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:db:0e 10.100.0.12'], port_security=['fa:16:3e:1a:db:0e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4e287d24-a3ba-4551-ac59-c2f692b6c9b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b03613c5-81a2-4cce-8c31-aa177cb2e3e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.388 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b03613c5-81a2-4cce-8c31-aa177cb2e3e0 in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a unbound from our chassis
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.389 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3598732e-75d5-4a2b-8884-521ea92eab7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.390 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f1eed4a8-9379-4786-b3b5-489454295d68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.391 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace which is not needed anymore
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.398 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000055.scope: Deactivated successfully.
Jan 23 09:26:11 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000055.scope: Consumed 8.744s CPU time.
Jan 23 09:26:11 compute-0 systemd-machined[153562]: Machine qemu-44-instance-00000055 terminated.
Jan 23 09:26:11 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [NOTICE]   (220358) : haproxy version is 2.8.14-c23fe91
Jan 23 09:26:11 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [NOTICE]   (220358) : path to executable is /usr/sbin/haproxy
Jan 23 09:26:11 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [ALERT]    (220358) : Current worker (220360) exited with code 143 (Terminated)
Jan 23 09:26:11 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220354]: [WARNING]  (220358) : All workers exited. Exiting... (0)
Jan 23 09:26:11 compute-0 systemd[1]: libpod-e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73.scope: Deactivated successfully.
Jan 23 09:26:11 compute-0 podman[220426]: 2026-01-23 09:26:11.494458478 +0000 UTC m=+0.038663392 container died e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:26:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73-userdata-shm.mount: Deactivated successfully.
Jan 23 09:26:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c7f0f4ac8207c5be9f7472119af0f7fd59087d63473d26b503f8d307d36f9ac-merged.mount: Deactivated successfully.
Jan 23 09:26:11 compute-0 podman[220426]: 2026-01-23 09:26:11.514279993 +0000 UTC m=+0.058484897 container cleanup e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:26:11 compute-0 systemd[1]: libpod-conmon-e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73.scope: Deactivated successfully.
Jan 23 09:26:11 compute-0 podman[220450]: 2026-01-23 09:26:11.558138335 +0000 UTC m=+0.027997510 container remove e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.562 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[363cb934-9420-4fed-86fa-22bf90146cac]: (4, ('Fri Jan 23 09:26:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73)\ne8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73\nFri Jan 23 09:26:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (e8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73)\ne8d638b104deb4dbdb45648d802899a1aea372010d3a3d2161dc5554fa57fb73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.563 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e581f9b3-3e00-4029-b5df-7db2166f773a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.564 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.566 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 kernel: tap3598732e-70: left promiscuous mode
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.583 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.585 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4863b773-26f6-4871-b93d-dfad6cf5913c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.595 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3c5fb1-dc94-45ad-bd29-2764746088f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.596 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9dbd93-007d-4f09-bd53-d7594154220e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.600 182096 INFO nova.virt.libvirt.driver [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Instance destroyed successfully.
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.600 182096 DEBUG nova.objects.instance [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'resources' on Instance uuid 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.609 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e21ce9de-09b4-456a-874e-0f4bbc1d0fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386817, 'reachable_time': 25349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220477, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d3598732e\x2d75d5\x2d4a2b\x2d8884\x2d521ea92eab7a.mount: Deactivated successfully.
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.611 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:26:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:11.612 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[70908d62-106d-47cc-ace7-81cf6b98bd83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.621 182096 DEBUG nova.virt.libvirt.vif [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:25:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-17528005',display_name='tempest-ServerDiskConfigTestJSON-server-17528005',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-17528005',id=85,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-3mirjmas',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:26:07Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=4e287d24-a3ba-4551-ac59-c2f692b6c9b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.621 182096 DEBUG nova.network.os_vif_util [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "address": "fa:16:3e:1a:db:0e", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb03613c5-81", "ovs_interfaceid": "b03613c5-81a2-4cce-8c31-aa177cb2e3e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.621 182096 DEBUG nova.network.os_vif_util [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.622 182096 DEBUG os_vif [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.623 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.623 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb03613c5-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.624 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.625 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.627 182096 INFO os_vif [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:db:0e,bridge_name='br-int',has_traffic_filtering=True,id=b03613c5-81a2-4cce-8c31-aa177cb2e3e0,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb03613c5-81')
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.628 182096 INFO nova.virt.libvirt.driver [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Deleting instance files /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1_del
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.632 182096 INFO nova.virt.libvirt.driver [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Deletion of /var/lib/nova/instances/4e287d24-a3ba-4551-ac59-c2f692b6c9b1_del complete
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.636 182096 DEBUG nova.compute.manager [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.636 182096 DEBUG oslo_concurrency.lockutils [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.637 182096 DEBUG oslo_concurrency.lockutils [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.637 182096 DEBUG oslo_concurrency.lockutils [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.637 182096 DEBUG nova.compute.manager [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.637 182096 DEBUG nova.compute.manager [req-686fc280-5050-446a-863e-d90f872de330 req-d06627e0-3a63-4df2-bde7-ff2a76c8f215 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-unplugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.679 182096 INFO nova.compute.manager [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.679 182096 DEBUG oslo.service.loopingcall [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.679 182096 DEBUG nova.compute.manager [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:26:11 compute-0 nova_compute[182092]: 2026-01-23 09:26:11.680 182096 DEBUG nova.network.neutron [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.073 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.447 182096 DEBUG nova.network.neutron [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.462 182096 INFO nova.compute.manager [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Took 0.78 seconds to deallocate network for instance.
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.514 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.514 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.559 182096 DEBUG nova.compute.provider_tree [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.569 182096 DEBUG nova.compute.manager [req-093afc28-471c-41fb-bf58-09e4df04f596 req-7b59f540-0ef7-429a-adf6-d5c41fac0185 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-deleted-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.572 182096 DEBUG nova.scheduler.client.report [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.586 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.604 182096 INFO nova.scheduler.client.report [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Deleted allocations for instance 4e287d24-a3ba-4551-ac59-c2f692b6c9b1
Jan 23 09:26:12 compute-0 nova_compute[182092]: 2026-01-23 09:26:12.670 182096 DEBUG oslo_concurrency.lockutils [None req-cb19f73d-721d-45f8-895a-42f0a42155e8 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.734 182096 DEBUG nova.compute.manager [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.734 182096 DEBUG oslo_concurrency.lockutils [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.735 182096 DEBUG oslo_concurrency.lockutils [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.735 182096 DEBUG oslo_concurrency.lockutils [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4e287d24-a3ba-4551-ac59-c2f692b6c9b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.735 182096 DEBUG nova.compute.manager [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] No waiting events found dispatching network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:13 compute-0 nova_compute[182092]: 2026-01-23 09:26:13.735 182096 WARNING nova.compute.manager [req-25b493df-4151-4d2b-b333-06bc771c4088 req-a603c5b8-d3db-46db-99c8-ef5c999f645f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Received unexpected event network-vif-plugged-b03613c5-81a2-4cce-8c31-aa177cb2e3e0 for instance with vm_state deleted and task_state None.
Jan 23 09:26:14 compute-0 podman[220478]: 2026-01-23 09:26:14.205861952 +0000 UTC m=+0.043601799 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=)
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.373 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.373 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.383 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.498 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.499 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.503 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.503 182096 INFO nova.compute.claims [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.627 182096 DEBUG nova.compute.provider_tree [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.642 182096 DEBUG nova.scheduler.client.report [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.657 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.657 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.703 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.704 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.716 182096 INFO nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.728 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.806 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.808 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.809 182096 INFO nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Creating image(s)
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.809 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.809 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.810 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.820 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.867 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.868 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.868 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.877 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.923 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.924 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.947 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.947 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.948 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.993 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.994 182096 DEBUG nova.virt.disk.api [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Checking if we can resize image /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:26:14 compute-0 nova_compute[182092]: 2026-01-23 09:26:14.994 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.040 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.040 182096 DEBUG nova.virt.disk.api [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Cannot resize image /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.041 182096 DEBUG nova.objects.instance [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'migration_context' on Instance uuid 636b203f-360e-4be0-958a-4c24011f0bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.057 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.058 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Ensure instance console log exists: /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.059 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.060 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.060 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.531 182096 DEBUG nova.policy [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5edc8d287a9f4ffd90f54ecea19df7e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.889 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160360.8881536, 477e372f-4119-4251-9759-6196a5c39ac4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.889 182096 INFO nova.compute.manager [-] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] VM Stopped (Lifecycle Event)
Jan 23 09:26:15 compute-0 nova_compute[182092]: 2026-01-23 09:26:15.906 182096 DEBUG nova.compute.manager [None req-76c2ec32-009d-4b4c-8ed3-d724198df59c - - - - - -] [instance: 477e372f-4119-4251-9759-6196a5c39ac4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:16 compute-0 nova_compute[182092]: 2026-01-23 09:26:16.624 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:16 compute-0 nova_compute[182092]: 2026-01-23 09:26:16.694 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Successfully created port: bed700ba-856a-4543-bdc3-88c5402aac8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.075 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.642 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Successfully updated port: bed700ba-856a-4543-bdc3-88c5402aac8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.655 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.655 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquired lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.656 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.757 182096 DEBUG nova.compute.manager [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-changed-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.758 182096 DEBUG nova.compute.manager [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Refreshing instance network info cache due to event network-changed-bed700ba-856a-4543-bdc3-88c5402aac8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:26:17 compute-0 nova_compute[182092]: 2026-01-23 09:26:17.758 182096 DEBUG oslo_concurrency.lockutils [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.081 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.680 182096 DEBUG nova.network.neutron [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Updating instance_info_cache with network_info: [{"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.695 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Releasing lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.695 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Instance network_info: |[{"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.695 182096 DEBUG oslo_concurrency.lockutils [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.696 182096 DEBUG nova.network.neutron [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Refreshing network info cache for port bed700ba-856a-4543-bdc3-88c5402aac8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.698 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Start _get_guest_xml network_info=[{"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.701 182096 WARNING nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.708 182096 DEBUG nova.virt.libvirt.host [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.709 182096 DEBUG nova.virt.libvirt.host [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.711 182096 DEBUG nova.virt.libvirt.host [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.712 182096 DEBUG nova.virt.libvirt.host [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.713 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.713 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.713 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.714 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.714 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.714 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.714 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.715 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.715 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.715 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.715 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.716 182096 DEBUG nova.virt.hardware [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.718 182096 DEBUG nova.virt.libvirt.vif [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:26:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2014843578',display_name='tempest-ServerDiskConfigTestJSON-server-2014843578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2014843578',id=88,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-ag1bvrq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:26:14Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=636b203f-360e-4be0-958a-4c24011f0bca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.719 182096 DEBUG nova.network.os_vif_util [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.719 182096 DEBUG nova.network.os_vif_util [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.720 182096 DEBUG nova.objects.instance [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'pci_devices' on Instance uuid 636b203f-360e-4be0-958a-4c24011f0bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.728 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <uuid>636b203f-360e-4be0-958a-4c24011f0bca</uuid>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <name>instance-00000058</name>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-2014843578</nova:name>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:26:18</nova:creationTime>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:user uuid="5edc8d287a9f4ffd90f54ecea19df7e8">tempest-ServerDiskConfigTestJSON-1935371313-project-member</nova:user>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:project uuid="b9afff815c4546ad97f6d3afa2c35483">tempest-ServerDiskConfigTestJSON-1935371313</nova:project>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         <nova:port uuid="bed700ba-856a-4543-bdc3-88c5402aac8b">
Jan 23 09:26:18 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <system>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="serial">636b203f-360e-4be0-958a-4c24011f0bca</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="uuid">636b203f-360e-4be0-958a-4c24011f0bca</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </system>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <os>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </os>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <features>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </features>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.config"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:d2:b2:00"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <target dev="tapbed700ba-85"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/console.log" append="off"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <video>
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </video>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:26:18 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:26:18 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:26:18 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:26:18 compute-0 nova_compute[182092]: </domain>
Jan 23 09:26:18 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.729 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Preparing to wait for external event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.730 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.730 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.730 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.731 182096 DEBUG nova.virt.libvirt.vif [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:26:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2014843578',display_name='tempest-ServerDiskConfigTestJSON-server-2014843578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2014843578',id=88,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-ag1bvrq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:26:14Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=636b203f-360e-4be0-958a-4c24011f0bca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.731 182096 DEBUG nova.network.os_vif_util [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.732 182096 DEBUG nova.network.os_vif_util [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.732 182096 DEBUG os_vif [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.732 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.733 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.733 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.735 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.735 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbed700ba-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.736 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbed700ba-85, col_values=(('external_ids', {'iface-id': 'bed700ba-856a-4543-bdc3-88c5402aac8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:b2:00', 'vm-uuid': '636b203f-360e-4be0-958a-4c24011f0bca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.737 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:18 compute-0 NetworkManager[54920]: <info>  [1769160378.7377] manager: (tapbed700ba-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.740 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.740 182096 INFO os_vif [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85')
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.772 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.773 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.773 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] No VIF found with MAC fa:16:3e:d2:b2:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:26:18 compute-0 nova_compute[182092]: 2026-01-23 09:26:18.774 182096 INFO nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Using config drive
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.307 182096 INFO nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Creating config drive at /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.config
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.311 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4nt9iiaf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.429 182096 DEBUG oslo_concurrency.processutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4nt9iiaf" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:19 compute-0 kernel: tapbed700ba-85: entered promiscuous mode
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.4696] manager: (tapbed700ba-85): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Jan 23 09:26:19 compute-0 ovn_controller[94697]: 2026-01-23T09:26:19Z|00319|binding|INFO|Claiming lport bed700ba-856a-4543-bdc3-88c5402aac8b for this chassis.
Jan 23 09:26:19 compute-0 ovn_controller[94697]: 2026-01-23T09:26:19Z|00320|binding|INFO|bed700ba-856a-4543-bdc3-88c5402aac8b: Claiming fa:16:3e:d2:b2:00 10.100.0.8
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.485 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b2:00 10.100.0.8'], port_security=['fa:16:3e:d2:b2:00 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '636b203f-360e-4be0-958a-4c24011f0bca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=bed700ba-856a-4543-bdc3-88c5402aac8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.486 103978 INFO neutron.agent.ovn.metadata.agent [-] Port bed700ba-856a-4543-bdc3-88c5402aac8b in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a bound to our chassis
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.487 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.489 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 ovn_controller[94697]: 2026-01-23T09:26:19Z|00321|binding|INFO|Setting lport bed700ba-856a-4543-bdc3-88c5402aac8b ovn-installed in OVS
Jan 23 09:26:19 compute-0 ovn_controller[94697]: 2026-01-23T09:26:19Z|00322|binding|INFO|Setting lport bed700ba-856a-4543-bdc3-88c5402aac8b up in Southbound
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.494 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.496 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb201eff-8243-418b-8597-5a5ede0dde97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.497 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3598732e-71 in ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.499 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3598732e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.499 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9a31f2fa-cf65-4d1c-99d2-209354e1ea8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.500 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3d0752-5166-49e8-ac0a-746ced026b6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 systemd-udevd[220531]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.510 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f297beba-14a1-459d-bfe0-d87162d1867a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.5148] device (tapbed700ba-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.5154] device (tapbed700ba-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:26:19 compute-0 systemd-machined[153562]: New machine qemu-45-instance-00000058.
Jan 23 09:26:19 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000058.
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.530 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3d996f-5fe7-409f-aa36-29325b76385e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.551 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ec69d1fe-2474-4b9d-b811-4449fc07372a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.5552] manager: (tap3598732e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/165)
Jan 23 09:26:19 compute-0 systemd-udevd[220536]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.556 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50f40fc2-e49e-4061-9342-9d57b7faefbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.583 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0905c9ba-bd17-44c7-9d1d-966b04ba7c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.586 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bd5bde-f8a4-4ad1-8cfc-2268137b99ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.6028] device (tap3598732e-70): carrier: link connected
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.607 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9d2514-c2e6-44e4-b913-d608be5840d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec526bc-1b48-407e-b00b-4eb9f9306275]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388508, 'reachable_time': 39700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220558, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.632 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b60ee49-5d09-4686-80dc-5b1ec3b48eef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388508, 'tstamp': 388508}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220559, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.647 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[36f6d94d-4852-4025-b547-365cc8b5684f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3598732e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:00:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388508, 'reachable_time': 39700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220560, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.668 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a3982928-07f8-435c-b216-3ce4ed891f85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.708 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[699e498d-5589-414a-8439-896af932cf43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.709 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.709 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.709 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3598732e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:19 compute-0 kernel: tap3598732e-70: entered promiscuous mode
Jan 23 09:26:19 compute-0 NetworkManager[54920]: <info>  [1769160379.7113] manager: (tap3598732e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.714 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3598732e-70, col_values=(('external_ids', {'iface-id': '9bf071ba-d027-4af7-a154-40b491b7a535'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:19 compute-0 ovn_controller[94697]: 2026-01-23T09:26:19Z|00323|binding|INFO|Releasing lport 9bf071ba-d027-4af7-a154-40b491b7a535 from this chassis (sb_readonly=0)
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.716 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.717 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f219c228-d7cd-46b3-9a5f-e017a3373c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.717 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3598732e-75d5-4a2b-8884-521ea92eab7a.pid.haproxy
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3598732e-75d5-4a2b-8884-521ea92eab7a
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:26:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:19.718 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'env', 'PROCESS_TAG=haproxy-3598732e-75d5-4a2b-8884-521ea92eab7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3598732e-75d5-4a2b-8884-521ea92eab7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:26:19 compute-0 nova_compute[182092]: 2026-01-23 09:26:19.730 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:19 compute-0 podman[220588]: 2026-01-23 09:26:19.998136948 +0000 UTC m=+0.036156713 container create 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:26:20 compute-0 systemd[1]: Started libpod-conmon-7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073.scope.
Jan 23 09:26:20 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:26:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09be5b5aca3455ee6d8d1eea4955dea91042572ac435e80fb3ab94b51c659a20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.054 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160380.054397, 636b203f-360e-4be0-958a-4c24011f0bca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.055 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] VM Started (Lifecycle Event)
Jan 23 09:26:20 compute-0 podman[220588]: 2026-01-23 09:26:20.058374852 +0000 UTC m=+0.096394617 container init 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:26:20 compute-0 podman[220588]: 2026-01-23 09:26:20.062841408 +0000 UTC m=+0.100861173 container start 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:26:20 compute-0 podman[220588]: 2026-01-23 09:26:19.979184924 +0000 UTC m=+0.017204709 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.075 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.077 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160380.0545857, 636b203f-360e-4be0-958a-4c24011f0bca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.077 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] VM Paused (Lifecycle Event)
Jan 23 09:26:20 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [NOTICE]   (220611) : New worker (220613) forked
Jan 23 09:26:20 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [NOTICE]   (220611) : Loading success.
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.091 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.093 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.104 182096 DEBUG nova.network.neutron [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Updated VIF entry in instance network info cache for port bed700ba-856a-4543-bdc3-88c5402aac8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.104 182096 DEBUG nova.network.neutron [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Updating instance_info_cache with network_info: [{"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.106 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:26:20 compute-0 nova_compute[182092]: 2026-01-23 09:26:20.113 182096 DEBUG oslo_concurrency.lockutils [req-26f5e64e-c651-4461-960a-03fb23a9ed9b req-075e8e99-0d34-46e6-91b3-ccf86f966e76 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-636b203f-360e-4be0-958a-4c24011f0bca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG nova.compute.manager [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG nova.compute.manager [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Processing event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.255 182096 DEBUG nova.compute.manager [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.256 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.256 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.256 182096 DEBUG oslo_concurrency.lockutils [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.256 182096 DEBUG nova.compute.manager [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] No waiting events found dispatching network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.256 182096 WARNING nova.compute.manager [req-44ba410c-bfc7-4f22-a9c1-38ae1e8586b5 req-d1077d80-8b33-441c-a47f-5baffd964d0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received unexpected event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b for instance with vm_state building and task_state spawning.
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.257 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.259 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160381.2593718, 636b203f-360e-4be0-958a-4c24011f0bca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.259 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] VM Resumed (Lifecycle Event)
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.262 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.265 182096 INFO nova.virt.libvirt.driver [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Instance spawned successfully.
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.265 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.278 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.282 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.285 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.285 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.285 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.286 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.286 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.286 182096 DEBUG nova.virt.libvirt.driver [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.303 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.339 182096 INFO nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Took 6.53 seconds to spawn the instance on the hypervisor.
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.340 182096 DEBUG nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.389 182096 INFO nova.compute.manager [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Took 6.92 seconds to build instance.
Jan 23 09:26:21 compute-0 nova_compute[182092]: 2026-01-23 09:26:21.399 182096 DEBUG oslo_concurrency.lockutils [None req-6cd8d8c3-1d69-42cb-9872-02ef7ec0a3aa 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:22 compute-0 nova_compute[182092]: 2026-01-23 09:26:22.077 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:23 compute-0 nova_compute[182092]: 2026-01-23 09:26:23.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:24 compute-0 podman[220618]: 2026-01-23 09:26:24.225281511 +0000 UTC m=+0.059554105 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.260 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.260 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.260 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.261 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.261 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.267 182096 INFO nova.compute.manager [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Terminating instance
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.272 182096 DEBUG nova.compute.manager [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:26:25 compute-0 kernel: tapbed700ba-85 (unregistering): left promiscuous mode
Jan 23 09:26:25 compute-0 NetworkManager[54920]: <info>  [1769160385.2909] device (tapbed700ba-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.297 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 ovn_controller[94697]: 2026-01-23T09:26:25Z|00324|binding|INFO|Releasing lport bed700ba-856a-4543-bdc3-88c5402aac8b from this chassis (sb_readonly=0)
Jan 23 09:26:25 compute-0 ovn_controller[94697]: 2026-01-23T09:26:25Z|00325|binding|INFO|Setting lport bed700ba-856a-4543-bdc3-88c5402aac8b down in Southbound
Jan 23 09:26:25 compute-0 ovn_controller[94697]: 2026-01-23T09:26:25Z|00326|binding|INFO|Removing iface tapbed700ba-85 ovn-installed in OVS
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.304 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b2:00 10.100.0.8'], port_security=['fa:16:3e:d2:b2:00 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '636b203f-360e-4be0-958a-4c24011f0bca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3598732e-75d5-4a2b-8884-521ea92eab7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9afff815c4546ad97f6d3afa2c35483', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7652143e-debd-4a5a-90a5-8ccbe554976b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d5ac5-7f76-4d27-a905-10a9d18c8f4a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=bed700ba-856a-4543-bdc3-88c5402aac8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.305 103978 INFO neutron.agent.ovn.metadata.agent [-] Port bed700ba-856a-4543-bdc3-88c5402aac8b in datapath 3598732e-75d5-4a2b-8884-521ea92eab7a unbound from our chassis
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.306 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3598732e-75d5-4a2b-8884-521ea92eab7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.311 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f51248f9-2db7-4540-88f8-edf0cd3614c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.311 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a namespace which is not needed anymore
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 23 09:26:25 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000058.scope: Consumed 4.577s CPU time.
Jan 23 09:26:25 compute-0 systemd-machined[153562]: Machine qemu-45-instance-00000058 terminated.
Jan 23 09:26:25 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [NOTICE]   (220611) : haproxy version is 2.8.14-c23fe91
Jan 23 09:26:25 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [NOTICE]   (220611) : path to executable is /usr/sbin/haproxy
Jan 23 09:26:25 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [ALERT]    (220611) : Current worker (220613) exited with code 143 (Terminated)
Jan 23 09:26:25 compute-0 neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a[220606]: [WARNING]  (220611) : All workers exited. Exiting... (0)
Jan 23 09:26:25 compute-0 systemd[1]: libpod-7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073.scope: Deactivated successfully.
Jan 23 09:26:25 compute-0 podman[220663]: 2026-01-23 09:26:25.410180781 +0000 UTC m=+0.037401763 container died 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:26:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-09be5b5aca3455ee6d8d1eea4955dea91042572ac435e80fb3ab94b51c659a20-merged.mount: Deactivated successfully.
Jan 23 09:26:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073-userdata-shm.mount: Deactivated successfully.
Jan 23 09:26:25 compute-0 podman[220663]: 2026-01-23 09:26:25.430127762 +0000 UTC m=+0.057348744 container cleanup 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:26:25 compute-0 systemd[1]: libpod-conmon-7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073.scope: Deactivated successfully.
Jan 23 09:26:25 compute-0 podman[220685]: 2026-01-23 09:26:25.472407516 +0000 UTC m=+0.025555114 container remove 7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.477 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc12296-ee25-497c-9ecd-2994bc7f911e]: (4, ('Fri Jan 23 09:26:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073)\n7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073\nFri Jan 23 09:26:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a (7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073)\n7ad8442348dcd5c570db7ba918f206c0d9a379b39f48c43d36c6451eb09b3073\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.479 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[07be5b29-bcbb-42a4-b5a4-432d9b228271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.479 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3598732e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:25 compute-0 kernel: tap3598732e-70: left promiscuous mode
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.481 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 NetworkManager[54920]: <info>  [1769160385.4871] manager: (tapbed700ba-85): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.498 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.501 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.500 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[26cd7aac-d17f-420b-a735-a6cdc2b30ae7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.508 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[22556b33-a1e2-44bc-98a5-dd058edd0ca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.509 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12723960-f6f9-4d13-930d-039b4517da3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.515 182096 DEBUG nova.compute.manager [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-unplugged-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.515 182096 DEBUG oslo_concurrency.lockutils [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.515 182096 DEBUG oslo_concurrency.lockutils [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.516 182096 DEBUG oslo_concurrency.lockutils [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.516 182096 DEBUG nova.compute.manager [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] No waiting events found dispatching network-vif-unplugged-bed700ba-856a-4543-bdc3-88c5402aac8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.516 182096 DEBUG nova.compute.manager [req-954bd86d-bda4-470b-b86c-0e4c818a8af7 req-487b3f8a-b273-4c04-ba62-5dcbdf5af72c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-unplugged-bed700ba-856a-4543-bdc3-88c5402aac8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.522 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[71d1e154-e927-4bc9-80f4-f6e0b07ad43a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388503, 'reachable_time': 15358, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220710, 'error': None, 'target': 'ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d3598732e\x2d75d5\x2d4a2b\x2d8884\x2d521ea92eab7a.mount: Deactivated successfully.
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.524 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3598732e-75d5-4a2b-8884-521ea92eab7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:26:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:25.524 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5710464d-f0cf-4507-bbbd-dcd0b39a333f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.529 182096 INFO nova.virt.libvirt.driver [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Instance destroyed successfully.
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.529 182096 DEBUG nova.objects.instance [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lazy-loading 'resources' on Instance uuid 636b203f-360e-4be0-958a-4c24011f0bca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.545 182096 DEBUG nova.virt.libvirt.vif [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2014843578',display_name='tempest-ServerDiskConfigTestJSON-server-2014843578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2014843578',id=88,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b9afff815c4546ad97f6d3afa2c35483',ramdisk_id='',reservation_id='r-ag1bvrq3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1935371313',owner_user_name='tempest-ServerDiskConfigTestJSON-1935371313-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:26:23Z,user_data=None,user_id='5edc8d287a9f4ffd90f54ecea19df7e8',uuid=636b203f-360e-4be0-958a-4c24011f0bca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.545 182096 DEBUG nova.network.os_vif_util [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converting VIF {"id": "bed700ba-856a-4543-bdc3-88c5402aac8b", "address": "fa:16:3e:d2:b2:00", "network": {"id": "3598732e-75d5-4a2b-8884-521ea92eab7a", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1607724068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b9afff815c4546ad97f6d3afa2c35483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbed700ba-85", "ovs_interfaceid": "bed700ba-856a-4543-bdc3-88c5402aac8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.546 182096 DEBUG nova.network.os_vif_util [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.546 182096 DEBUG os_vif [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.547 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.547 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbed700ba-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.553 182096 INFO os_vif [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:b2:00,bridge_name='br-int',has_traffic_filtering=True,id=bed700ba-856a-4543-bdc3-88c5402aac8b,network=Network(3598732e-75d5-4a2b-8884-521ea92eab7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbed700ba-85')
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.553 182096 INFO nova.virt.libvirt.driver [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Deleting instance files /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca_del
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.554 182096 INFO nova.virt.libvirt.driver [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Deletion of /var/lib/nova/instances/636b203f-360e-4be0-958a-4c24011f0bca_del complete
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.610 182096 INFO nova.compute.manager [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.611 182096 DEBUG oslo.service.loopingcall [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.611 182096 DEBUG nova.compute.manager [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:26:25 compute-0 nova_compute[182092]: 2026-01-23 09:26:25.611 182096 DEBUG nova.network.neutron [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.493 182096 DEBUG nova.network.neutron [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.504 182096 INFO nova.compute.manager [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Took 0.89 seconds to deallocate network for instance.
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.548 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.548 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.598 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160371.5975432, 4e287d24-a3ba-4551-ac59-c2f692b6c9b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.598 182096 INFO nova.compute.manager [-] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] VM Stopped (Lifecycle Event)
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.602 182096 DEBUG nova.compute.provider_tree [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.615 182096 DEBUG nova.compute.manager [None req-a4f75110-9068-4967-b826-f9c73773f499 - - - - - -] [instance: 4e287d24-a3ba-4551-ac59-c2f692b6c9b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.616 182096 DEBUG nova.scheduler.client.report [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.629 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.652 182096 INFO nova.scheduler.client.report [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Deleted allocations for instance 636b203f-360e-4be0-958a-4c24011f0bca
Jan 23 09:26:26 compute-0 nova_compute[182092]: 2026-01-23 09:26:26.725 182096 DEBUG oslo_concurrency.lockutils [None req-06145578-1210-4c20-b33e-f3a3b83fc0e2 5edc8d287a9f4ffd90f54ecea19df7e8 b9afff815c4546ad97f6d3afa2c35483 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.078 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.606 182096 DEBUG nova.compute.manager [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 DEBUG oslo_concurrency.lockutils [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "636b203f-360e-4be0-958a-4c24011f0bca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 DEBUG oslo_concurrency.lockutils [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 DEBUG oslo_concurrency.lockutils [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "636b203f-360e-4be0-958a-4c24011f0bca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 DEBUG nova.compute.manager [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] No waiting events found dispatching network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 WARNING nova.compute.manager [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received unexpected event network-vif-plugged-bed700ba-856a-4543-bdc3-88c5402aac8b for instance with vm_state deleted and task_state None.
Jan 23 09:26:27 compute-0 nova_compute[182092]: 2026-01-23 09:26:27.607 182096 DEBUG nova.compute.manager [req-13220d7d-c166-414b-8311-beee936f4f7e req-f85c0f76-c907-4571-94b2-2a8bf51cfd94 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Received event network-vif-deleted-bed700ba-856a-4543-bdc3-88c5402aac8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:30 compute-0 nova_compute[182092]: 2026-01-23 09:26:30.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:30 compute-0 nova_compute[182092]: 2026-01-23 09:26:30.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:31 compute-0 podman[220716]: 2026-01-23 09:26:31.212234165 +0000 UTC m=+0.049665789 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 09:26:31 compute-0 podman[220717]: 2026-01-23 09:26:31.230289298 +0000 UTC m=+0.067511066 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:26:32 compute-0 nova_compute[182092]: 2026-01-23 09:26:32.080 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:26:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:26:35 compute-0 nova_compute[182092]: 2026-01-23 09:26:35.549 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:37 compute-0 nova_compute[182092]: 2026-01-23 09:26:37.081 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:39.861 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:39.861 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:39.862 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:40 compute-0 nova_compute[182092]: 2026-01-23 09:26:40.528 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160385.528035, 636b203f-360e-4be0-958a-4c24011f0bca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:40 compute-0 nova_compute[182092]: 2026-01-23 09:26:40.529 182096 INFO nova.compute.manager [-] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] VM Stopped (Lifecycle Event)
Jan 23 09:26:40 compute-0 nova_compute[182092]: 2026-01-23 09:26:40.543 182096 DEBUG nova.compute.manager [None req-93300174-c617-43f9-b074-b433320fed2f - - - - - -] [instance: 636b203f-360e-4be0-958a-4c24011f0bca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:40 compute-0 nova_compute[182092]: 2026-01-23 09:26:40.550 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.558 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.559 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.581 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.645 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.646 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.650 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.650 182096 INFO nova.compute.claims [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.753 182096 DEBUG nova.compute.provider_tree [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.767 182096 DEBUG nova.scheduler.client.report [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.782 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.783 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.832 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.833 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.847 182096 INFO nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.862 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.951 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.952 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.952 182096 INFO nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Creating image(s)
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.953 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.953 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.953 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:41 compute-0 nova_compute[182092]: 2026-01-23 09:26:41.963 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.014 182096 DEBUG nova.policy [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89c019480e524c04af4d250b1c4051e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.016 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.016 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.017 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.026 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.073 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.074 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.088 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.099 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.100 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.100 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.158 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.159 182096 DEBUG nova.virt.disk.api [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.159 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:42 compute-0 podman[220768]: 2026-01-23 09:26:42.206567681 +0000 UTC m=+0.040796266 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.212 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.213 182096 DEBUG nova.virt.disk.api [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.214 182096 DEBUG nova.objects.instance [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.226 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.227 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Ensure instance console log exists: /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.227 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.227 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.228 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:42 compute-0 podman[220765]: 2026-01-23 09:26:42.237239486 +0000 UTC m=+0.076227370 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:26:42 compute-0 nova_compute[182092]: 2026-01-23 09:26:42.745 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Successfully created port: 57bcbf69-c3cd-41cb-abf4-09620c4f7165 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.690 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Successfully updated port: 57bcbf69-c3cd-41cb-abf4-09620c4f7165 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.710 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.711 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.711 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.774 182096 DEBUG nova.compute.manager [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-changed-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.775 182096 DEBUG nova.compute.manager [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Refreshing instance network info cache due to event network-changed-57bcbf69-c3cd-41cb-abf4-09620c4f7165. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.775 182096 DEBUG oslo_concurrency.lockutils [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:43 compute-0 nova_compute[182092]: 2026-01-23 09:26:43.911 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.607 182096 DEBUG nova.network.neutron [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.624 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.625 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance network_info: |[{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.625 182096 DEBUG oslo_concurrency.lockutils [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.625 182096 DEBUG nova.network.neutron [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Refreshing network info cache for port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.627 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start _get_guest_xml network_info=[{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.631 182096 WARNING nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.637 182096 DEBUG nova.virt.libvirt.host [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.637 182096 DEBUG nova.virt.libvirt.host [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.640 182096 DEBUG nova.virt.libvirt.host [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.641 182096 DEBUG nova.virt.libvirt.host [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.642 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.642 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.642 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.642 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.643 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.643 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.643 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.643 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.643 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.644 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.644 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.644 182096 DEBUG nova.virt.hardware [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.647 182096 DEBUG nova.virt.libvirt.vif [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:26:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.647 182096 DEBUG nova.network.os_vif_util [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.647 182096 DEBUG nova.network.os_vif_util [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.648 182096 DEBUG nova.objects.instance [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.656 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <uuid>179ad298-ad18-41dd-9b5e-31064aadf7d6</uuid>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <name>instance-0000005a</name>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-235035716</nova:name>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:26:44</nova:creationTime>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         <nova:port uuid="57bcbf69-c3cd-41cb-abf4-09620c4f7165">
Jan 23 09:26:44 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="serial">179ad298-ad18-41dd-9b5e-31064aadf7d6</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="uuid">179ad298-ad18-41dd-9b5e-31064aadf7d6</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:1e:94:3b"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <target dev="tap57bcbf69-c3"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/console.log" append="off"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:26:44 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:26:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:26:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:26:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:26:44 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.657 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Preparing to wait for external event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.658 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.658 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.659 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.659 182096 DEBUG nova.virt.libvirt.vif [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:26:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.660 182096 DEBUG nova.network.os_vif_util [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.660 182096 DEBUG nova.network.os_vif_util [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.660 182096 DEBUG os_vif [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.661 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.662 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.664 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.664 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57bcbf69-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.664 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57bcbf69-c3, col_values=(('external_ids', {'iface-id': '57bcbf69-c3cd-41cb-abf4-09620c4f7165', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:94:3b', 'vm-uuid': '179ad298-ad18-41dd-9b5e-31064aadf7d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.665 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:44 compute-0 NetworkManager[54920]: <info>  [1769160404.6663] manager: (tap57bcbf69-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.669 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.670 182096 INFO os_vif [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3')
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.721 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.721 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.721 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] No VIF found with MAC fa:16:3e:1e:94:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:26:44 compute-0 nova_compute[182092]: 2026-01-23 09:26:44.722 182096 INFO nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Using config drive
Jan 23 09:26:44 compute-0 podman[220810]: 2026-01-23 09:26:44.733133325 +0000 UTC m=+0.040712498 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, 
release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.150 182096 INFO nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Creating config drive at /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.155 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplm1oc_y7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.275 182096 DEBUG oslo_concurrency.processutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplm1oc_y7" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:26:45 compute-0 kernel: tap57bcbf69-c3: entered promiscuous mode
Jan 23 09:26:45 compute-0 ovn_controller[94697]: 2026-01-23T09:26:45Z|00327|binding|INFO|Claiming lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 for this chassis.
Jan 23 09:26:45 compute-0 ovn_controller[94697]: 2026-01-23T09:26:45Z|00328|binding|INFO|57bcbf69-c3cd-41cb-abf4-09620c4f7165: Claiming fa:16:3e:1e:94:3b 10.100.0.7
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.3163] manager: (tap57bcbf69-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.316 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.318 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.320 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.321 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.322 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.330 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[280f8c40-8dc0-41f5-bc09-1c5e0ff9279c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.331 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.333 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.333 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ee74a4-5d49-4829-b2f6-74eacf8cd8f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.334 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3c842f15-b959-476f-93d0-d8241a763ebb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 systemd-udevd[220847]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.342 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[94011c29-05a9-4c17-a582-3132ded67317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.3525] device (tap57bcbf69-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.3533] device (tap57bcbf69-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:26:45 compute-0 systemd-machined[153562]: New machine qemu-46-instance-0000005a.
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.364 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c542e596-e24c-4414-b810-eabe3326d12c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-0000005a.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.378 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 ovn_controller[94697]: 2026-01-23T09:26:45Z|00329|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 ovn-installed in OVS
Jan 23 09:26:45 compute-0 ovn_controller[94697]: 2026-01-23T09:26:45Z|00330|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 up in Southbound
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.385 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2877ff76-1100-4c4c-b60f-d70b21d3c647]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.385 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.3893] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/170)
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.388 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[19985fe9-4e75-4c4d-a4a8-7046c8cbd749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 systemd-udevd[220851]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.413 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[36907b27-00d9-4481-b6ab-dc33861ba7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.415 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbf672d-d4a2-4366-8c07-fad4e426c53c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.4309] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.433 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[020f24a5-995d-4b81-b819-f6e83f8ac000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.445 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7b836b-775d-4123-80f1-1ef335efc295]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391091, 'reachable_time': 27024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220871, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.456 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[acd80f2d-2a05-4d31-8d52-22f82349c965]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391091, 'tstamp': 391091}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220872, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.470 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[57e4ce52-6f61-4452-9295-981a2e7177b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391091, 'reachable_time': 27024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220873, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.493 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b44410-00e0-4aa6-af4f-9d0c8392512e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.534 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4a52ed-1731-4a4f-a1d7-d0076b3ecf5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.535 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.535 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.536 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:45 compute-0 NetworkManager[54920]: <info>  [1769160405.5378] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 23 09:26:45 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.541 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.541 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:26:45 compute-0 ovn_controller[94697]: 2026-01-23T09:26:45Z|00331|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.544 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.545 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[36f900ad-6302-4990-9bfd-73e8d3986826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.545 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:26:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:26:45.546 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.555 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.578 182096 DEBUG nova.compute.manager [req-b826a58d-67a5-4551-9267-297c5c42a048 req-f7ae3db6-8a3e-495c-a0c8-08e750d36707 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.579 182096 DEBUG oslo_concurrency.lockutils [req-b826a58d-67a5-4551-9267-297c5c42a048 req-f7ae3db6-8a3e-495c-a0c8-08e750d36707 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.579 182096 DEBUG oslo_concurrency.lockutils [req-b826a58d-67a5-4551-9267-297c5c42a048 req-f7ae3db6-8a3e-495c-a0c8-08e750d36707 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.579 182096 DEBUG oslo_concurrency.lockutils [req-b826a58d-67a5-4551-9267-297c5c42a048 req-f7ae3db6-8a3e-495c-a0c8-08e750d36707 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.579 182096 DEBUG nova.compute.manager [req-b826a58d-67a5-4551-9267-297c5c42a048 req-f7ae3db6-8a3e-495c-a0c8-08e750d36707 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Processing event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.588 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160405.5886247, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.589 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Started (Lifecycle Event)
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.590 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.597 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.600 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance spawned successfully.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.600 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.611 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.614 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.621 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.621 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.621 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.621 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.622 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.622 182096 DEBUG nova.virt.libvirt.driver [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.628 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.628 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160405.5887413, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.628 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Paused (Lifecycle Event)
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.647 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.649 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160405.5922558, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.649 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Resumed (Lifecycle Event)
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.666 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.668 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.685 182096 INFO nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Took 3.73 seconds to spawn the instance on the hypervisor.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.685 182096 DEBUG nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.692 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.734 182096 INFO nova.compute.manager [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Took 4.11 seconds to build instance.
Jan 23 09:26:45 compute-0 nova_compute[182092]: 2026-01-23 09:26:45.744 182096 DEBUG oslo_concurrency.lockutils [None req-192031ce-ffde-4ad6-aaeb-b3cdf493c789 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:45 compute-0 podman[220908]: 2026-01-23 09:26:45.84787679 +0000 UTC m=+0.039011329 container create 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:26:45 compute-0 systemd[1]: Started libpod-conmon-50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c.scope.
Jan 23 09:26:45 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:26:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df73861b282a63b2b49a372332c2c129116f53821c243bf9495f18bd78a410ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:26:45 compute-0 podman[220908]: 2026-01-23 09:26:45.909334806 +0000 UTC m=+0.100469364 container init 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:26:45 compute-0 podman[220908]: 2026-01-23 09:26:45.913951856 +0000 UTC m=+0.105086394 container start 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 09:26:45 compute-0 podman[220908]: 2026-01-23 09:26:45.834104859 +0000 UTC m=+0.025239418 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:26:45 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [NOTICE]   (220924) : New worker (220926) forked
Jan 23 09:26:45 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [NOTICE]   (220924) : Loading success.
Jan 23 09:26:46 compute-0 nova_compute[182092]: 2026-01-23 09:26:46.164 182096 DEBUG nova.network.neutron [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updated VIF entry in instance network info cache for port 57bcbf69-c3cd-41cb-abf4-09620c4f7165. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:26:46 compute-0 nova_compute[182092]: 2026-01-23 09:26:46.164 182096 DEBUG nova.network.neutron [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:46 compute-0 nova_compute[182092]: 2026-01-23 09:26:46.183 182096 DEBUG oslo_concurrency.lockutils [req-f1db705d-8e6e-4ac6-a20d-ff19aa89f549 req-70869626-f72b-4579-92a1-c87ad8519357 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.083 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:47 compute-0 NetworkManager[54920]: <info>  [1769160407.2136] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 23 09:26:47 compute-0 NetworkManager[54920]: <info>  [1769160407.2141] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.213 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.349 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:47 compute-0 ovn_controller[94697]: 2026-01-23T09:26:47Z|00332|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.363 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.699 182096 DEBUG nova.compute.manager [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.700 182096 DEBUG oslo_concurrency.lockutils [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.700 182096 DEBUG oslo_concurrency.lockutils [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.700 182096 DEBUG oslo_concurrency.lockutils [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.701 182096 DEBUG nova.compute.manager [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:26:47 compute-0 nova_compute[182092]: 2026-01-23 09:26:47.701 182096 WARNING nova.compute.manager [req-75764a1d-7fbb-478f-8914-663fbc6902f7 req-03bf1085-078b-427d-b6b2-f0080c035b36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state active and task_state None.
Jan 23 09:26:48 compute-0 nova_compute[182092]: 2026-01-23 09:26:48.431 182096 DEBUG nova.compute.manager [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-changed-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:26:48 compute-0 nova_compute[182092]: 2026-01-23 09:26:48.432 182096 DEBUG nova.compute.manager [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Refreshing instance network info cache due to event network-changed-57bcbf69-c3cd-41cb-abf4-09620c4f7165. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:26:48 compute-0 nova_compute[182092]: 2026-01-23 09:26:48.432 182096 DEBUG oslo_concurrency.lockutils [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:26:48 compute-0 nova_compute[182092]: 2026-01-23 09:26:48.432 182096 DEBUG oslo_concurrency.lockutils [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:26:48 compute-0 nova_compute[182092]: 2026-01-23 09:26:48.433 182096 DEBUG nova.network.neutron [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Refreshing network info cache for port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:26:49 compute-0 nova_compute[182092]: 2026-01-23 09:26:49.665 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:50 compute-0 nova_compute[182092]: 2026-01-23 09:26:50.765 182096 DEBUG nova.network.neutron [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updated VIF entry in instance network info cache for port 57bcbf69-c3cd-41cb-abf4-09620c4f7165. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:26:50 compute-0 nova_compute[182092]: 2026-01-23 09:26:50.766 182096 DEBUG nova.network.neutron [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:26:50 compute-0 nova_compute[182092]: 2026-01-23 09:26:50.780 182096 DEBUG oslo_concurrency.lockutils [req-81381e87-e50f-4e04-8e85-94526fe58a4a req-e06b36ad-86b8-424f-88f0-b5127de3e0fa 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:26:51 compute-0 nova_compute[182092]: 2026-01-23 09:26:51.434 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:52 compute-0 nova_compute[182092]: 2026-01-23 09:26:52.084 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:54 compute-0 nova_compute[182092]: 2026-01-23 09:26:54.667 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:55 compute-0 podman[220932]: 2026-01-23 09:26:55.230959442 +0000 UTC m=+0.063130430 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 23 09:26:56 compute-0 ovn_controller[94697]: 2026-01-23T09:26:56Z|00333|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:26:56 compute-0 nova_compute[182092]: 2026-01-23 09:26:56.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:57 compute-0 ovn_controller[94697]: 2026-01-23T09:26:57Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:94:3b 10.100.0.7
Jan 23 09:26:57 compute-0 ovn_controller[94697]: 2026-01-23T09:26:57Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:94:3b 10.100.0.7
Jan 23 09:26:57 compute-0 nova_compute[182092]: 2026-01-23 09:26:57.085 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:26:59 compute-0 nova_compute[182092]: 2026-01-23 09:26:59.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:00 compute-0 nova_compute[182092]: 2026-01-23 09:27:00.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.086 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:02 compute-0 podman[220968]: 2026-01-23 09:27:02.212403895 +0000 UTC m=+0.044145916 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:27:02 compute-0 podman[220967]: 2026-01-23 09:27:02.222489592 +0000 UTC m=+0.055863752 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.668 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.717 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.764 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.765 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:02 compute-0 nova_compute[182092]: 2026-01-23 09:27:02.813 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.039 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.040 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5525MB free_disk=73.23473358154297GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.040 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.041 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.102 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 179ad298-ad18-41dd-9b5e-31064aadf7d6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.103 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.104 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.106 182096 DEBUG oslo_concurrency.lockutils [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.106 182096 DEBUG oslo_concurrency.lockutils [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.107 182096 DEBUG nova.compute.manager [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.109 182096 DEBUG nova.compute.manager [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.110 182096 DEBUG nova.objects.instance [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.127 182096 DEBUG nova.objects.instance [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.141 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.157 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.160 182096 DEBUG nova.virt.libvirt.driver [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.175 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:27:03 compute-0 nova_compute[182092]: 2026-01-23 09:27:03.175 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:04 compute-0 ovn_controller[94697]: 2026-01-23T09:27:04Z|00334|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:27:04 compute-0 nova_compute[182092]: 2026-01-23 09:27:04.245 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:04 compute-0 nova_compute[182092]: 2026-01-23 09:27:04.671 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.175 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.176 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.176 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.189 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.189 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.189 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.190 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:05 compute-0 kernel: tap57bcbf69-c3 (unregistering): left promiscuous mode
Jan 23 09:27:05 compute-0 NetworkManager[54920]: <info>  [1769160425.2908] device (tap57bcbf69-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.295 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 ovn_controller[94697]: 2026-01-23T09:27:05Z|00335|binding|INFO|Releasing lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 from this chassis (sb_readonly=0)
Jan 23 09:27:05 compute-0 ovn_controller[94697]: 2026-01-23T09:27:05Z|00336|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 down in Southbound
Jan 23 09:27:05 compute-0 ovn_controller[94697]: 2026-01-23T09:27:05Z|00337|binding|INFO|Removing iface tap57bcbf69-c3 ovn-installed in OVS
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.296 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.307 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.308 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.309 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.310 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b52eb0a-69bf-46e0-b7b5-afb74cd67fdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.311 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.315 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 23 09:27:05 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005a.scope: Consumed 11.263s CPU time.
Jan 23 09:27:05 compute-0 systemd-machined[153562]: Machine qemu-46-instance-0000005a terminated.
Jan 23 09:27:05 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [NOTICE]   (220924) : haproxy version is 2.8.14-c23fe91
Jan 23 09:27:05 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [NOTICE]   (220924) : path to executable is /usr/sbin/haproxy
Jan 23 09:27:05 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [ALERT]    (220924) : Current worker (220926) exited with code 143 (Terminated)
Jan 23 09:27:05 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[220920]: [WARNING]  (220924) : All workers exited. Exiting... (0)
Jan 23 09:27:05 compute-0 systemd[1]: libpod-50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c.scope: Deactivated successfully.
Jan 23 09:27:05 compute-0 podman[221033]: 2026-01-23 09:27:05.416853397 +0000 UTC m=+0.034808930 container died 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:27:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c-userdata-shm.mount: Deactivated successfully.
Jan 23 09:27:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-df73861b282a63b2b49a372332c2c129116f53821c243bf9495f18bd78a410ab-merged.mount: Deactivated successfully.
Jan 23 09:27:05 compute-0 podman[221033]: 2026-01-23 09:27:05.442528565 +0000 UTC m=+0.060484098 container cleanup 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:27:05 compute-0 systemd[1]: libpod-conmon-50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c.scope: Deactivated successfully.
Jan 23 09:27:05 compute-0 podman[221057]: 2026-01-23 09:27:05.482150565 +0000 UTC m=+0.024332907 container remove 50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.485 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d86acbc-dd79-436c-8e68-b1e22d117523]: (4, ('Fri Jan 23 09:27:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c)\n50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c\nFri Jan 23 09:27:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c)\n50fe2b076267604391de642da96e9377d148a59ac60f01e7634dbc84272e000c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.486 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b06a9926-9eab-418d-b419-73047cd2506f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.487 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.488 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.504 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.506 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c6784eb5-5aed-4032-91a5-5b8f7484b392]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.516 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[62f2f4f6-d8cc-4639-8667-032f0e6392f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.516 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[255d2cd8-8741-4825-b2e9-60b2d82f59ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.529 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8fbaca-d355-4db9-a036-630e76d72a31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391086, 'reachable_time': 16281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221075, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.530 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:27:05 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.530 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[0eef8105-0bc0-414f-9a0b-6725c76a8d42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.693 182096 DEBUG nova.compute.manager [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.694 182096 DEBUG oslo_concurrency.lockutils [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.694 182096 DEBUG oslo_concurrency.lockutils [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.694 182096 DEBUG oslo_concurrency.lockutils [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.694 182096 DEBUG nova.compute.manager [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.694 182096 WARNING nova.compute.manager [req-51899fce-7485-4a5c-9573-efd953d508d6 req-89a06b46-3b95-403a-9ea5-98dd1cc7cd48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state active and task_state powering-off.
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.842 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.842 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:05.843 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:27:05 compute-0 nova_compute[182092]: 2026-01-23 09:27:05.949 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:06 compute-0 nova_compute[182092]: 2026-01-23 09:27:06.174 182096 INFO nova.virt.libvirt.driver [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance shutdown successfully after 3 seconds.
Jan 23 09:27:06 compute-0 nova_compute[182092]: 2026-01-23 09:27:06.179 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance destroyed successfully.
Jan 23 09:27:06 compute-0 nova_compute[182092]: 2026-01-23 09:27:06.179 182096 DEBUG nova.objects.instance [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:06 compute-0 nova_compute[182092]: 2026-01-23 09:27:06.193 182096 DEBUG nova.compute.manager [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:06 compute-0 nova_compute[182092]: 2026-01-23 09:27:06.242 182096 DEBUG oslo_concurrency.lockutils [None req-14c505cb-0bbb-4d21-ae2f-4cbce0f78ad0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.089 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.500 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.517 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.534 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.969 182096 DEBUG nova.compute.manager [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.969 182096 DEBUG oslo_concurrency.lockutils [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.970 182096 DEBUG oslo_concurrency.lockutils [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.970 182096 DEBUG oslo_concurrency.lockutils [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.970 182096 DEBUG nova.compute.manager [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:07 compute-0 nova_compute[182092]: 2026-01-23 09:27:07.970 182096 WARNING nova.compute.manager [req-8ce14aa6-b925-412c-b217-c0eb4a94d2ac req-d5385d42-d180-452d-be2d-b96befe911ec 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state stopped and task_state powering-on.
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.055 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.074 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.074 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.075 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.075 182096 DEBUG nova.network.neutron [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.075 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.076 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.076 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.076 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:27:08 compute-0 nova_compute[182092]: 2026-01-23 09:27:08.546 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.140 182096 DEBUG nova.network.neutron [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.157 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.174 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance destroyed successfully.
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.174 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.182 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.190 182096 DEBUG nova.virt.libvirt.vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:27:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.190 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.191 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.191 182096 DEBUG os_vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.192 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.193 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57bcbf69-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.194 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.195 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.198 182096 INFO os_vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3')
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.202 182096 DEBUG nova.virt.libvirt.driver [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start _get_guest_xml network_info=[{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.204 182096 WARNING nova.virt.libvirt.driver [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.208 182096 DEBUG nova.virt.libvirt.host [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.208 182096 DEBUG nova.virt.libvirt.host [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.211 182096 DEBUG nova.virt.libvirt.host [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.211 182096 DEBUG nova.virt.libvirt.host [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.212 182096 DEBUG nova.virt.libvirt.driver [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.212 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.212 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.213 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.213 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.213 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.213 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.213 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.214 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.214 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.214 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.214 182096 DEBUG nova.virt.hardware [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.214 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.225 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.274 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.275 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.275 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.276 182096 DEBUG oslo_concurrency.lockutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.277 182096 DEBUG nova.virt.libvirt.vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:27:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.277 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.278 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.279 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.287 182096 DEBUG nova.virt.libvirt.driver [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <uuid>179ad298-ad18-41dd-9b5e-31064aadf7d6</uuid>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <name>instance-0000005a</name>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestJSON-server-235035716</nova:name>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:27:09</nova:creationTime>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:user uuid="89c019480e524c04af4d250b1c4051e5">tempest-ServerActionsTestJSON-766366320-project-member</nova:user>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:project uuid="860ef09b9e6e4866bbe99b6e769733a3">tempest-ServerActionsTestJSON-766366320</nova:project>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         <nova:port uuid="57bcbf69-c3cd-41cb-abf4-09620c4f7165">
Jan 23 09:27:09 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <system>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="serial">179ad298-ad18-41dd-9b5e-31064aadf7d6</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="uuid">179ad298-ad18-41dd-9b5e-31064aadf7d6</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </system>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <os>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </os>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <features>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </features>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk.config"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:1e:94:3b"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <target dev="tap57bcbf69-c3"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/console.log" append="off"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <video>
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </video>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:27:09 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:27:09 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:27:09 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:27:09 compute-0 nova_compute[182092]: </domain>
Jan 23 09:27:09 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.288 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.336 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.337 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.383 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.384 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.400 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.446 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.446 182096 DEBUG nova.virt.disk.api [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Checking if we can resize image /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.447 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.493 182096 DEBUG oslo_concurrency.processutils [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.494 182096 DEBUG nova.virt.disk.api [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Cannot resize image /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.494 182096 DEBUG nova.objects.instance [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.508 182096 DEBUG nova.virt.libvirt.vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.508 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.509 182096 DEBUG nova.network.os_vif_util [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.509 182096 DEBUG os_vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.510 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.510 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.511 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.513 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.513 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57bcbf69-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.513 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57bcbf69-c3, col_values=(('external_ids', {'iface-id': '57bcbf69-c3cd-41cb-abf4-09620c4f7165', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:94:3b', 'vm-uuid': '179ad298-ad18-41dd-9b5e-31064aadf7d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.5156] manager: (tap57bcbf69-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.515 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.518 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.520 182096 INFO os_vif [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3')
Jan 23 09:27:09 compute-0 kernel: tap57bcbf69-c3: entered promiscuous mode
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.572 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 ovn_controller[94697]: 2026-01-23T09:27:09Z|00338|binding|INFO|Claiming lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 for this chassis.
Jan 23 09:27:09 compute-0 ovn_controller[94697]: 2026-01-23T09:27:09Z|00339|binding|INFO|57bcbf69-c3cd-41cb-abf4-09620c4f7165: Claiming fa:16:3e:1e:94:3b 10.100.0.7
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.5739] manager: (tap57bcbf69-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.579 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.580 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.582 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:09 compute-0 ovn_controller[94697]: 2026-01-23T09:27:09Z|00340|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 ovn-installed in OVS
Jan 23 09:27:09 compute-0 ovn_controller[94697]: 2026-01-23T09:27:09Z|00341|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 up in Southbound
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.588 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.590 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1a903f61-c687-4f39-bb7f-6a30bddec418]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.590 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.593 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.593 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb57635-0431-49d2-8497-ce384c728443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.594 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c902babf-1828-4dc4-981b-e8523ddc6603]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 systemd-udevd[221124]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.605 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc29095-7616-4e55-a608-f89f3538c722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.6138] device (tap57bcbf69-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.6145] device (tap57bcbf69-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:27:09 compute-0 systemd-machined[153562]: New machine qemu-47-instance-0000005a.
Jan 23 09:27:09 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-0000005a.
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.627 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[82d69e7a-eef0-4f48-a108-325171b787c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.648 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[382141bf-bbf1-42df-84ef-ffd5d77c1a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.6525] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.654 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa391a1e-1df9-4015-8446-72617cfea1bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.683 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b51b97-82e0-4653-a01f-87c34d9d36a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.685 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2de67776-7c5e-4a3e-a0d5-ceec4a9dd552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.7038] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.708 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[62d579b6-6920-4a96-acf9-e6df4312dc0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.720 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[921a0ee5-650d-4932-a10d-11772c6c36de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393518, 'reachable_time': 42756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221148, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.732 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[487b6f74-d7f9-496d-a6f9-de71b6a89734]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393518, 'tstamp': 393518}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221149, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.745 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9982f4-8aaf-4a54-bf76-46ef9106f07e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393518, 'reachable_time': 42756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221150, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.766 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[40f44cbe-1a84-40de-894f-b5c81dc29fc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.808 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[60eb4fd8-fa29-4ba9-84e9-52ffe9703246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.809 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.809 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.810 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.811 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:27:09 compute-0 NetworkManager[54920]: <info>  [1769160429.8120] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.818 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:09 compute-0 ovn_controller[94697]: 2026-01-23T09:27:09Z|00342|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.819 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.822 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.830 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.830 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[97d370f9-41e8-48d3-8a2c-93c6cdd5899b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.832 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:27:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:09.833 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.928 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 179ad298-ad18-41dd-9b5e-31064aadf7d6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.933 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160429.9284945, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.934 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Resumed (Lifecycle Event)
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.936 182096 DEBUG nova.compute.manager [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.942 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance rebooted successfully.
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.942 182096 DEBUG nova.compute.manager [None req-41a42621-dc0f-4a05-9f31-b681e47e403b 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.970 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.972 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.996 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.997 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160429.9292326, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:09 compute-0 nova_compute[182092]: 2026-01-23 09:27:09.997 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Started (Lifecycle Event)
Jan 23 09:27:10 compute-0 nova_compute[182092]: 2026-01-23 09:27:10.018 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:10 compute-0 nova_compute[182092]: 2026-01-23 09:27:10.029 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:10 compute-0 podman[221185]: 2026-01-23 09:27:10.128579948 +0000 UTC m=+0.030595803 container create 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:27:10 compute-0 systemd[1]: Started libpod-conmon-5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d.scope.
Jan 23 09:27:10 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:27:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74ede6c78723dbd9f503fc295184c28254125129ccf5c271d0306cb3d2ebb8a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:27:10 compute-0 podman[221185]: 2026-01-23 09:27:10.185120246 +0000 UTC m=+0.087136100 container init 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:27:10 compute-0 podman[221185]: 2026-01-23 09:27:10.189808211 +0000 UTC m=+0.091824055 container start 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:27:10 compute-0 podman[221185]: 2026-01-23 09:27:10.11500567 +0000 UTC m=+0.017021524 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:27:10 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [NOTICE]   (221202) : New worker (221204) forked
Jan 23 09:27:10 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [NOTICE]   (221202) : Loading success.
Jan 23 09:27:10 compute-0 nova_compute[182092]: 2026-01-23 09:27:10.401 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:11 compute-0 nova_compute[182092]: 2026-01-23 09:27:11.482 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:12 compute-0 nova_compute[182092]: 2026-01-23 09:27:12.091 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:13 compute-0 podman[221210]: 2026-01-23 09:27:13.228215551 +0000 UTC m=+0.068961993 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 09:27:13 compute-0 podman[221211]: 2026-01-23 09:27:13.228355876 +0000 UTC m=+0.066618702 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.100 182096 DEBUG nova.objects.instance [None req-ee61c14e-043e-478c-9d67-6d5045a76b71 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.116 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160434.1161036, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.116 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Paused (Lifecycle Event)
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.127 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.130 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.143 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.338 182096 DEBUG nova.compute.manager [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.338 182096 DEBUG oslo_concurrency.lockutils [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.339 182096 DEBUG oslo_concurrency.lockutils [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.339 182096 DEBUG oslo_concurrency.lockutils [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.339 182096 DEBUG nova.compute.manager [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.339 182096 WARNING nova.compute.manager [req-f479cba2-020e-4f05-aa5e-57621b228498 req-03f3b11d-ce25-4842-a8e0-7f6b366dee3a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state active and task_state suspending.
Jan 23 09:27:14 compute-0 kernel: tap57bcbf69-c3 (unregistering): left promiscuous mode
Jan 23 09:27:14 compute-0 NetworkManager[54920]: <info>  [1769160434.5033] device (tap57bcbf69-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:27:14 compute-0 ovn_controller[94697]: 2026-01-23T09:27:14Z|00343|binding|INFO|Releasing lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 from this chassis (sb_readonly=0)
Jan 23 09:27:14 compute-0 ovn_controller[94697]: 2026-01-23T09:27:14Z|00344|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 down in Southbound
Jan 23 09:27:14 compute-0 ovn_controller[94697]: 2026-01-23T09:27:14Z|00345|binding|INFO|Removing iface tap57bcbf69-c3 ovn-installed in OVS
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.511 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.515 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.521 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.522 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.523 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.524 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4eef8c-2f40-4e21-88a5-6fda8f799717]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.525 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 23 09:27:14 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000005a.scope: Consumed 4.619s CPU time.
Jan 23 09:27:14 compute-0 systemd-machined[153562]: Machine qemu-47-instance-0000005a terminated.
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [NOTICE]   (221202) : haproxy version is 2.8.14-c23fe91
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [NOTICE]   (221202) : path to executable is /usr/sbin/haproxy
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [WARNING]  (221202) : Exiting Master process...
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [WARNING]  (221202) : Exiting Master process...
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [ALERT]    (221202) : Current worker (221204) exited with code 143 (Terminated)
Jan 23 09:27:14 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221198]: [WARNING]  (221202) : All workers exited. Exiting... (0)
Jan 23 09:27:14 compute-0 systemd[1]: libpod-5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d.scope: Deactivated successfully.
Jan 23 09:27:14 compute-0 podman[221272]: 2026-01-23 09:27:14.623304224 +0000 UTC m=+0.034161150 container died 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:27:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d-userdata-shm.mount: Deactivated successfully.
Jan 23 09:27:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-74ede6c78723dbd9f503fc295184c28254125129ccf5c271d0306cb3d2ebb8a1-merged.mount: Deactivated successfully.
Jan 23 09:27:14 compute-0 podman[221272]: 2026-01-23 09:27:14.642722844 +0000 UTC m=+0.053579760 container cleanup 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:27:14 compute-0 systemd[1]: libpod-conmon-5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d.scope: Deactivated successfully.
Jan 23 09:27:14 compute-0 podman[221295]: 2026-01-23 09:27:14.683035437 +0000 UTC m=+0.023414218 container remove 5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.686 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0509155f-73ab-4d52-b117-c20addbb1558]: (4, ('Fri Jan 23 09:27:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d)\n5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d\nFri Jan 23 09:27:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d)\n5cb0252e2fb7c3437daabdba7dfb8076cad5b080d1df29fd65c61e2bc707b33d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.688 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d29a3a45-5f6a-4f2e-be5b-3b5fc5648873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.689 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:14 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.690 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 NetworkManager[54920]: <error> [1769160434.7000] platform-linux: error reading net:/sys/class/net/tap57bcbf69-c3/dev_id: error reading 4096 bytes from file descriptor: Invalid argument
Jan 23 09:27:14 compute-0 NetworkManager[54920]: <info>  [1769160434.7007] manager: (tap57bcbf69-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.711 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.712 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.713 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c013cd77-991f-46c1-a2c2-f07189b09811]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.721 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba634c4-1402-43dc-a52f-49acd74979a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.722 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0edbcd5b-8706-4df3-b86b-485e02a1fa3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 nova_compute[182092]: 2026-01-23 09:27:14.736 182096 DEBUG nova.compute.manager [None req-ee61c14e-043e-478c-9d67-6d5045a76b71 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.737 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2393535-7824-4ebc-b4d0-3b1d3522a223]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393513, 'reachable_time': 24456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221319, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.739 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.739 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e99e2092-7a82-4339-86c0-ebfb57133c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:14.844 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:15 compute-0 podman[221325]: 2026-01-23 09:27:15.204188223 +0000 UTC m=+0.041932449 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 23 09:27:15 compute-0 nova_compute[182092]: 2026-01-23 09:27:15.333 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.093 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.219 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.219 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.219 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.219 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 WARNING nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state suspended and task_state None.
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.220 182096 WARNING nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state suspended and task_state None.
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 DEBUG oslo_concurrency.lockutils [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 DEBUG nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.221 182096 WARNING nova.compute.manager [req-d5d5bcee-1c6a-437b-a166-36c3c355b829 req-f541956c-d948-4c95-a251-94a4f424ecf2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state suspended and task_state None.
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.339 182096 INFO nova.compute.manager [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Resuming
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.340 182096 DEBUG nova.objects.instance [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'flavor' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.371 182096 DEBUG oslo_concurrency.lockutils [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.372 182096 DEBUG oslo_concurrency.lockutils [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquired lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:27:17 compute-0 nova_compute[182092]: 2026-01-23 09:27:17.373 182096 DEBUG nova.network.neutron [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.439 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.439 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.451 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.517 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.531 182096 DEBUG nova.network.neutron [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [{"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.533 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.533 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.538 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.538 182096 INFO nova.compute.claims [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.542 182096 DEBUG oslo_concurrency.lockutils [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Releasing lock "refresh_cache-179ad298-ad18-41dd-9b5e-31064aadf7d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.545 182096 DEBUG nova.virt.libvirt.vif [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:27:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.545 182096 DEBUG nova.network.os_vif_util [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.545 182096 DEBUG nova.network.os_vif_util [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.546 182096 DEBUG os_vif [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.546 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.546 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.546 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.548 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57bcbf69-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.548 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57bcbf69-c3, col_values=(('external_ids', {'iface-id': '57bcbf69-c3cd-41cb-abf4-09620c4f7165', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:94:3b', 'vm-uuid': '179ad298-ad18-41dd-9b5e-31064aadf7d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.548 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.549 182096 INFO os_vif [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3')
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.562 182096 DEBUG nova.objects.instance [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:19 compute-0 kernel: tap57bcbf69-c3: entered promiscuous mode
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.6180] manager: (tap57bcbf69-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.619 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_controller[94697]: 2026-01-23T09:27:19Z|00346|binding|INFO|Claiming lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 for this chassis.
Jan 23 09:27:19 compute-0 ovn_controller[94697]: 2026-01-23T09:27:19Z|00347|binding|INFO|57bcbf69-c3cd-41cb-abf4-09620c4f7165: Claiming fa:16:3e:1e:94:3b 10.100.0.7
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.626 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.627 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c bound to our chassis
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.628 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:19 compute-0 ovn_controller[94697]: 2026-01-23T09:27:19Z|00348|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 ovn-installed in OVS
Jan 23 09:27:19 compute-0 ovn_controller[94697]: 2026-01-23T09:27:19Z|00349|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 up in Southbound
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.635 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.637 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8eb55e-0f28-465c-92e3-2809a1826e17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.638 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa610e7c-51 in ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.640 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.639 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa610e7c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.640 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[35f47712-4687-442a-b3c6-be4dd9f65e4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.641 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c328e911-eb5e-4be7-869d-2b95328aeb68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 systemd-udevd[221359]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.649 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[eda28f37-531a-4c26-84be-4ac424102e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.6580] device (tap57bcbf69-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.6585] device (tap57bcbf69-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:27:19 compute-0 systemd-machined[153562]: New machine qemu-48-instance-0000005a.
Jan 23 09:27:19 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-0000005a.
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.670 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[45a9a233-83fd-4404-aca5-654cb5fb7e0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.691 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e288613f-620f-48f9-b316-9b44a8231545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.6964] manager: (tapfa610e7c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/180)
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.695 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c47f3053-4737-41d9-bc00-e4d7b4dc9a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.721 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e7bc78-cb1d-47f4-abc6-61fd93bac1bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.724 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c5fdab92-f580-4948-827c-772f5dcfa684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.7388] device (tapfa610e7c-50): carrier: link connected
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.743 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d16b4d18-ec7e-46bf-b0de-77cc260659b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.756 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4452d1de-1485-438f-9684-31ad547d1b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394522, 'reachable_time': 19036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221384, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.767 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[196a48a1-7d69-4ed6-b3a0-1c9f2c9ce580]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:516c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 394522, 'tstamp': 394522}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221385, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.780 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[884c9ba5-66c9-4669-89b1-2b3b727e9101]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa610e7c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:51:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394522, 'reachable_time': 19036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221386, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.802 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d394eb-18c4-4aed-a127-cf58587268e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.843 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb10d82-7d25-4cf0-b5c2-f91d7e207b3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.844 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.845 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.845 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa610e7c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 NetworkManager[54920]: <info>  [1769160439.8471] manager: (tapfa610e7c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Jan 23 09:27:19 compute-0 kernel: tapfa610e7c-50: entered promiscuous mode
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.850 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.854 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa610e7c-50, col_values=(('external_ids', {'iface-id': '087749ae-c7aa-4e41-83a4-068ced6791c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:19 compute-0 ovn_controller[94697]: 2026-01-23T09:27:19Z|00350|binding|INFO|Releasing lport 087749ae-c7aa-4e41-83a4-068ced6791c0 from this chassis (sb_readonly=0)
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.855 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.868 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.869 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.870 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2cdc12-493e-4992-b9d9-21fdfda4cc52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.871 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/fa610e7c-53f8-4775-b5b8-aa45897b011c.pid.haproxy
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID fa610e7c-53f8-4775-b5b8-aa45897b011c
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:27:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:19.872 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'env', 'PROCESS_TAG=haproxy-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa610e7c-53f8-4775-b5b8-aa45897b011c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.914 182096 DEBUG nova.compute.provider_tree [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.927 182096 DEBUG nova.scheduler.client.report [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.942 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.942 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.976 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.976 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:27:19 compute-0 nova_compute[182092]: 2026-01-23 09:27:19.990 182096 INFO nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.001 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.047 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 179ad298-ad18-41dd-9b5e-31064aadf7d6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.047 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160440.0473378, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.047 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Started (Lifecycle Event)
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.059 182096 DEBUG nova.compute.manager [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.060 182096 DEBUG nova.objects.instance [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.082 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.085 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.087 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance running successfully.
Jan 23 09:27:20 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.089 182096 DEBUG nova.virt.libvirt.guest [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.089 182096 DEBUG nova.compute.manager [None req-62523235-0e69-4ec6-a812-69087cd9ac42 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.094 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.095 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.095 182096 INFO nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Creating image(s)
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.095 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.096 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.097 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.107 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.121 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.121 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160440.0518906, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.121 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Resumed (Lifecycle Event)
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.157 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.161 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.161 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.162 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.175 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:20 compute-0 podman[221422]: 2026-01-23 09:27:20.184499368 +0000 UTC m=+0.032179422 container create 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.194 182096 DEBUG nova.policy [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'da28b720673344f98f7bca36bd875832', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a5b852465da47d2828e9d8bfdd0a944', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.203 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:20 compute-0 systemd[1]: Started libpod-conmon-4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57.scope.
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.223 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.224 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:20 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:27:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3be851f5af584a0c7147efac0718130719a50af778707bef815f93822a1a06c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:27:20 compute-0 podman[221422]: 2026-01-23 09:27:20.248404373 +0000 UTC m=+0.096084448 container init 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:27:20 compute-0 podman[221422]: 2026-01-23 09:27:20.252712029 +0000 UTC m=+0.100392082 container start 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.252 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:20 compute-0 podman[221422]: 2026-01-23 09:27:20.169751498 +0000 UTC m=+0.017431572 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.255 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.256 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:20 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [NOTICE]   (221446) : New worker (221449) forked
Jan 23 09:27:20 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [NOTICE]   (221446) : Loading success.
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.309 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.310 182096 DEBUG nova.virt.disk.api [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Checking if we can resize image /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.311 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.368 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.369 182096 DEBUG nova.virt.disk.api [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Cannot resize image /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.370 182096 DEBUG nova.objects.instance [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lazy-loading 'migration_context' on Instance uuid 2efdc376-bfd9-4146-80d0-89cad9bf5107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.383 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.383 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Ensure instance console log exists: /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.383 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.383 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.384 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.407 182096 DEBUG nova.compute.manager [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.407 182096 DEBUG oslo_concurrency.lockutils [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.407 182096 DEBUG oslo_concurrency.lockutils [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.408 182096 DEBUG oslo_concurrency.lockutils [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.408 182096 DEBUG nova.compute.manager [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.408 182096 WARNING nova.compute.manager [req-2e2ede44-f3d9-4eef-901e-990f32673b16 req-9d5127c2-9492-41fb-b1e1-ed036a3d8e5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state active and task_state None.
Jan 23 09:27:20 compute-0 nova_compute[182092]: 2026-01-23 09:27:20.811 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Successfully created port: 89cf9c6f-f3bf-489a-afce-46b308954126 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.095 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.134 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Successfully updated port: 89cf9c6f-f3bf-489a-afce-46b308954126 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.148 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.148 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquired lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.149 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.284 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.483 182096 DEBUG nova.compute.manager [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.483 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.484 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.484 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.484 182096 DEBUG nova.compute.manager [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.484 182096 WARNING nova.compute.manager [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state active and task_state None.
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.484 182096 DEBUG nova.compute.manager [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-changed-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.485 182096 DEBUG nova.compute.manager [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Refreshing instance network info cache due to event network-changed-89cf9c6f-f3bf-489a-afce-46b308954126. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.485 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.621 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.622 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.622 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.622 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.622 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.628 182096 INFO nova.compute.manager [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Terminating instance
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.634 182096 DEBUG nova.compute.manager [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:27:22 compute-0 kernel: tap57bcbf69-c3 (unregistering): left promiscuous mode
Jan 23 09:27:22 compute-0 NetworkManager[54920]: <info>  [1769160442.6565] device (tap57bcbf69-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:27:22 compute-0 ovn_controller[94697]: 2026-01-23T09:27:22Z|00351|binding|INFO|Releasing lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 from this chassis (sb_readonly=0)
Jan 23 09:27:22 compute-0 ovn_controller[94697]: 2026-01-23T09:27:22Z|00352|binding|INFO|Setting lport 57bcbf69-c3cd-41cb-abf4-09620c4f7165 down in Southbound
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.662 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 ovn_controller[94697]: 2026-01-23T09:27:22Z|00353|binding|INFO|Removing iface tap57bcbf69-c3 ovn-installed in OVS
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.678 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:94:3b 10.100.0.7'], port_security=['fa:16:3e:1e:94:3b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '179ad298-ad18-41dd-9b5e-31064aadf7d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '860ef09b9e6e4866bbe99b6e769733a3', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c73e3ccc-853f-4a55-921b-7a5406854b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a383db79-01c1-4214-aaf0-757b3e67012d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=57bcbf69-c3cd-41cb-abf4-09620c4f7165) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.679 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 57bcbf69-c3cd-41cb-abf4-09620c4f7165 in datapath fa610e7c-53f8-4775-b5b8-aa45897b011c unbound from our chassis
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.681 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa610e7c-53f8-4775-b5b8-aa45897b011c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.681 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.682 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[121427b4-736e-4040-a235-dd03b04208ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.682 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c namespace which is not needed anymore
Jan 23 09:27:22 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 23 09:27:22 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005a.scope: Consumed 2.927s CPU time.
Jan 23 09:27:22 compute-0 systemd-machined[153562]: Machine qemu-48-instance-0000005a terminated.
Jan 23 09:27:22 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [NOTICE]   (221446) : haproxy version is 2.8.14-c23fe91
Jan 23 09:27:22 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [NOTICE]   (221446) : path to executable is /usr/sbin/haproxy
Jan 23 09:27:22 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [WARNING]  (221446) : Exiting Master process...
Jan 23 09:27:22 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [ALERT]    (221446) : Current worker (221449) exited with code 143 (Terminated)
Jan 23 09:27:22 compute-0 neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c[221436]: [WARNING]  (221446) : All workers exited. Exiting... (0)
Jan 23 09:27:22 compute-0 systemd[1]: libpod-4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57.scope: Deactivated successfully.
Jan 23 09:27:22 compute-0 conmon[221436]: conmon 4349b05160b8248a453a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57.scope/container/memory.events
Jan 23 09:27:22 compute-0 podman[221479]: 2026-01-23 09:27:22.778694301 +0000 UTC m=+0.034054460 container died 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:27:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57-userdata-shm.mount: Deactivated successfully.
Jan 23 09:27:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-3be851f5af584a0c7147efac0718130719a50af778707bef815f93822a1a06c6-merged.mount: Deactivated successfully.
Jan 23 09:27:22 compute-0 podman[221479]: 2026-01-23 09:27:22.800464197 +0000 UTC m=+0.055824355 container cleanup 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 09:27:22 compute-0 systemd[1]: libpod-conmon-4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57.scope: Deactivated successfully.
Jan 23 09:27:22 compute-0 podman[221502]: 2026-01-23 09:27:22.84296176 +0000 UTC m=+0.026216574 container remove 4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.846 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[44a61810-7002-4a60-b2f3-44d2410559d8]: (4, ('Fri Jan 23 09:27:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57)\n4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57\nFri Jan 23 09:27:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c (4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57)\n4349b05160b8248a453adae444fafac342b140468502a8580f1272e36f5abf57\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.849 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[41b9d050-7803-4dbf-afde-13af9ca0b683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.850 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa610e7c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 kernel: tapfa610e7c-50: left promiscuous mode
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.872 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[76fa1c64-30ca-4c4a-a664-ad2736c59592]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.880 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f85b0cf9-e453-4bc9-be3a-b375833fa11e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.881 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34194970-ec93-4ccc-ace0-5f0e40b42026]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.891 182096 INFO nova.virt.libvirt.driver [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Instance destroyed successfully.
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.891 182096 DEBUG nova.objects.instance [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lazy-loading 'resources' on Instance uuid 179ad298-ad18-41dd-9b5e-31064aadf7d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.893 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1ba35e-b145-4905-a4d1-0f69ee5849a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394517, 'reachable_time': 44277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221531, 'error': None, 'target': 'ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 systemd[1]: run-netns-ovnmeta\x2dfa610e7c\x2d53f8\x2d4775\x2db5b8\x2daa45897b011c.mount: Deactivated successfully.
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.896 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa610e7c-53f8-4775-b5b8-aa45897b011c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:27:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:22.896 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b576bcab-9455-417e-8d61-58becee0c782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.906 182096 DEBUG nova.virt.libvirt.vif [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-235035716',display_name='tempest-ServerActionsTestJSON-server-235035716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-235035716',id=90,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFxdNHA51Dj3D+6o372O6t+nlYQ+lrsYYw8Wjt3F0Br7FW9Uzi4a/vhPnWm+K3oTWkuVi8eitSQfz1oWu+auxxSGZ4/CzTI8nGjTH9r41UxFBCrk+Q7fka6ukPQCbR2MBw==',key_name='tempest-keypair-1903779778',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:26:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='860ef09b9e6e4866bbe99b6e769733a3',ramdisk_id='',reservation_id='r-fwke0l95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-766366320',owner_user_name='tempest-ServerActionsTestJSON-766366320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:27:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='89c019480e524c04af4d250b1c4051e5',uuid=179ad298-ad18-41dd-9b5e-31064aadf7d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.907 182096 DEBUG nova.network.os_vif_util [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converting VIF {"id": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "address": "fa:16:3e:1e:94:3b", "network": {"id": "fa610e7c-53f8-4775-b5b8-aa45897b011c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-549684817-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "860ef09b9e6e4866bbe99b6e769733a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57bcbf69-c3", "ovs_interfaceid": "57bcbf69-c3cd-41cb-abf4-09620c4f7165", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.908 182096 DEBUG nova.network.os_vif_util [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.908 182096 DEBUG os_vif [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.909 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.910 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57bcbf69-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.913 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.915 182096 INFO os_vif [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:94:3b,bridge_name='br-int',has_traffic_filtering=True,id=57bcbf69-c3cd-41cb-abf4-09620c4f7165,network=Network(fa610e7c-53f8-4775-b5b8-aa45897b011c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57bcbf69-c3')
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.915 182096 INFO nova.virt.libvirt.driver [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Deleting instance files /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6_del
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.915 182096 INFO nova.virt.libvirt.driver [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Deletion of /var/lib/nova/instances/179ad298-ad18-41dd-9b5e-31064aadf7d6_del complete
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.943 182096 DEBUG nova.network.neutron [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Updating instance_info_cache with network_info: [{"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.970 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Releasing lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.971 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Instance network_info: |[{"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.971 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.971 182096 DEBUG nova.network.neutron [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Refreshing network info cache for port 89cf9c6f-f3bf-489a-afce-46b308954126 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.974 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Start _get_guest_xml network_info=[{"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.978 182096 WARNING nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.983 182096 DEBUG nova.virt.libvirt.host [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.983 182096 DEBUG nova.virt.libvirt.host [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.987 182096 DEBUG nova.virt.libvirt.host [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.987 182096 DEBUG nova.virt.libvirt.host [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.988 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.989 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.989 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.989 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.990 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.990 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.990 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.990 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.991 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.991 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.991 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.991 182096 DEBUG nova.virt.hardware [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.994 182096 DEBUG nova.virt.libvirt.vif [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-772147004',display_name='tempest-ServerAddressesTestJSON-server-772147004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-772147004',id=94,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a5b852465da47d2828e9d8bfdd0a944',ramdisk_id='',reservation_id='r-ck6tn3dv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-976814287',owner_user_name='tempest-ServerAddressesTes
tJSON-976814287-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:20Z,user_data=None,user_id='da28b720673344f98f7bca36bd875832',uuid=2efdc376-bfd9-4146-80d0-89cad9bf5107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.995 182096 DEBUG nova.network.os_vif_util [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converting VIF {"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.995 182096 DEBUG nova.network.os_vif_util [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:22 compute-0 nova_compute[182092]: 2026-01-23 09:27:22.996 182096 DEBUG nova.objects.instance [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2efdc376-bfd9-4146-80d0-89cad9bf5107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.002 182096 INFO nova.compute.manager [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.003 182096 DEBUG oslo.service.loopingcall [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.003 182096 DEBUG nova.compute.manager [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.003 182096 DEBUG nova.network.neutron [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.007 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <uuid>2efdc376-bfd9-4146-80d0-89cad9bf5107</uuid>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <name>instance-0000005e</name>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerAddressesTestJSON-server-772147004</nova:name>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:27:22</nova:creationTime>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:user uuid="da28b720673344f98f7bca36bd875832">tempest-ServerAddressesTestJSON-976814287-project-member</nova:user>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:project uuid="2a5b852465da47d2828e9d8bfdd0a944">tempest-ServerAddressesTestJSON-976814287</nova:project>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         <nova:port uuid="89cf9c6f-f3bf-489a-afce-46b308954126">
Jan 23 09:27:23 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <system>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="serial">2efdc376-bfd9-4146-80d0-89cad9bf5107</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="uuid">2efdc376-bfd9-4146-80d0-89cad9bf5107</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </system>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <os>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </os>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <features>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </features>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.config"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:8b:4a:c5"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <target dev="tap89cf9c6f-f3"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/console.log" append="off"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <video>
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </video>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:27:23 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:27:23 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:27:23 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:27:23 compute-0 nova_compute[182092]: </domain>
Jan 23 09:27:23 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.008 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Preparing to wait for external event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.009 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.009 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.009 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.010 182096 DEBUG nova.virt.libvirt.vif [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-772147004',display_name='tempest-ServerAddressesTestJSON-server-772147004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-772147004',id=94,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a5b852465da47d2828e9d8bfdd0a944',ramdisk_id='',reservation_id='r-ck6tn3dv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-976814287',owner_user_name='tempest-ServerAd
dressesTestJSON-976814287-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:20Z,user_data=None,user_id='da28b720673344f98f7bca36bd875832',uuid=2efdc376-bfd9-4146-80d0-89cad9bf5107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.010 182096 DEBUG nova.network.os_vif_util [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converting VIF {"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.010 182096 DEBUG nova.network.os_vif_util [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.011 182096 DEBUG os_vif [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.011 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.012 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.012 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.014 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.014 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89cf9c6f-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.014 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap89cf9c6f-f3, col_values=(('external_ids', {'iface-id': '89cf9c6f-f3bf-489a-afce-46b308954126', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:4a:c5', 'vm-uuid': '2efdc376-bfd9-4146-80d0-89cad9bf5107'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.016 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.0165] manager: (tap89cf9c6f-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.018 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.020 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.020 182096 INFO os_vif [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3')
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.060 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.061 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.061 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] No VIF found with MAC fa:16:3e:8b:4a:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.061 182096 INFO nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Using config drive
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.369 182096 INFO nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Creating config drive at /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.config
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.373 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_r7b4so7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.492 182096 DEBUG oslo_concurrency.processutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_r7b4so7" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:23 compute-0 kernel: tap89cf9c6f-f3: entered promiscuous mode
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.5282] manager: (tap89cf9c6f-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Jan 23 09:27:23 compute-0 ovn_controller[94697]: 2026-01-23T09:27:23Z|00354|binding|INFO|Claiming lport 89cf9c6f-f3bf-489a-afce-46b308954126 for this chassis.
Jan 23 09:27:23 compute-0 ovn_controller[94697]: 2026-01-23T09:27:23Z|00355|binding|INFO|89cf9c6f-f3bf-489a-afce-46b308954126: Claiming fa:16:3e:8b:4a:c5 10.100.0.11
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.529 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 systemd-udevd[221527]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.541 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:4a:c5 10.100.0.11'], port_security=['fa:16:3e:8b:4a:c5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2efdc376-bfd9-4146-80d0-89cad9bf5107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5b852465da47d2828e9d8bfdd0a944', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e2b39e56-b509-49e6-a69a-d99981fbafa0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb792439-35c1-4659-ac1f-183a0fed8110, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=89cf9c6f-f3bf-489a-afce-46b308954126) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.542 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 89cf9c6f-f3bf-489a-afce-46b308954126 in datapath 3d4e04ad-2f9e-41ad-a89f-c09560078c02 bound to our chassis
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.5430] device (tap89cf9c6f-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.5437] device (tap89cf9c6f-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:27:23 compute-0 ovn_controller[94697]: 2026-01-23T09:27:23Z|00356|binding|INFO|Setting lport 89cf9c6f-f3bf-489a-afce-46b308954126 ovn-installed in OVS
Jan 23 09:27:23 compute-0 ovn_controller[94697]: 2026-01-23T09:27:23Z|00357|binding|INFO|Setting lport 89cf9c6f-f3bf-489a-afce-46b308954126 up in Southbound
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.543 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d4e04ad-2f9e-41ad-a89f-c09560078c02
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.546 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.552 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ac450af4-b751-4899-abaf-50af75e13179]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.552 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d4e04ad-21 in ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.554 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d4e04ad-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.554 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[abfea42e-bfde-4848-99ab-299e84dc661f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.554 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4838ec00-b74f-448f-8c60-a448a36cd497]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.563 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[60d52cdc-8ea7-40f3-93e5-e20d20b0708a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 systemd-machined[153562]: New machine qemu-49-instance-0000005e.
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.572 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffc2e70-635f-4ce7-8e49-bccc3fe66caa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000005e.
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.600 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc3ac1e-f20b-4881-9fcd-fb342d9588df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aee26800-49e4-4057-b8ed-e4fb1c7d5635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.6051] manager: (tap3d4e04ad-20): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.629 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf2300a-91b9-445f-b410-6e199652eeb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.631 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9db12c-66fa-4cdf-80fa-31cbe05f4acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.6472] device (tap3d4e04ad-20): carrier: link connected
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.650 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[43006ff1-3d60-4edf-8b91-c2086f799d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.663 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e9235aaf-7231-44ea-b3eb-bd3f83f665f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d4e04ad-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:af:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394913, 'reachable_time': 41313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221575, 'error': None, 'target': 'ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.674 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[233eb748-b2c1-4503-9544-71d7cbfd8afd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:afde'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 394913, 'tstamp': 394913}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221576, 'error': None, 'target': 'ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.689 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd271897-4f0b-4813-bec2-5991cd87b21e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d4e04ad-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:af:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394913, 'reachable_time': 41313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221577, 'error': None, 'target': 'ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.713 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c774d565-9d25-4a6f-98ad-c1dd9910a5c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.725 182096 DEBUG nova.network.neutron [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.754 182096 INFO nova.compute.manager [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Took 0.75 seconds to deallocate network for instance.
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.756 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7b05b0b7-a9c0-4d71-91ac-dc45fe7f5475]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.758 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d4e04ad-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.758 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.758 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d4e04ad-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 kernel: tap3d4e04ad-20: entered promiscuous mode
Jan 23 09:27:23 compute-0 NetworkManager[54920]: <info>  [1769160443.7604] manager: (tap3d4e04ad-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.764 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d4e04ad-20, col_values=(('external_ids', {'iface-id': '3e8abd57-3c9f-4466-9a86-16e712f023ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.765 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 ovn_controller[94697]: 2026-01-23T09:27:23Z|00358|binding|INFO|Releasing lport 3e8abd57-3c9f-4466-9a86-16e712f023ea from this chassis (sb_readonly=0)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.766 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.768 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d4e04ad-2f9e-41ad-a89f-c09560078c02.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d4e04ad-2f9e-41ad-a89f-c09560078c02.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.778 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.777 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b10df08f-db2e-49a5-bd08-a372f446b456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.779 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3d4e04ad-2f9e-41ad-a89f-c09560078c02
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3d4e04ad-2f9e-41ad-a89f-c09560078c02.pid.haproxy
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3d4e04ad-2f9e-41ad-a89f-c09560078c02
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:27:23 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:23.780 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'env', 'PROCESS_TAG=haproxy-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d4e04ad-2f9e-41ad-a89f-c09560078c02.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.824 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.824 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.834 182096 DEBUG nova.compute.manager [req-d94dac14-2658-48ab-bde5-07c956093495 req-e105745e-8a60-45fa-ae74-a903e2f790e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.834 182096 DEBUG oslo_concurrency.lockutils [req-d94dac14-2658-48ab-bde5-07c956093495 req-e105745e-8a60-45fa-ae74-a903e2f790e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.835 182096 DEBUG oslo_concurrency.lockutils [req-d94dac14-2658-48ab-bde5-07c956093495 req-e105745e-8a60-45fa-ae74-a903e2f790e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.835 182096 DEBUG oslo_concurrency.lockutils [req-d94dac14-2658-48ab-bde5-07c956093495 req-e105745e-8a60-45fa-ae74-a903e2f790e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.835 182096 DEBUG nova.compute.manager [req-d94dac14-2658-48ab-bde5-07c956093495 req-e105745e-8a60-45fa-ae74-a903e2f790e7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Processing event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.836 182096 DEBUG nova.compute.manager [req-17e573f9-dcc9-443a-ac73-5d604a9acd87 req-6918d9e6-d69a-4582-a0d0-88bef74a29ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-deleted-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.867 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160443.8662944, 2efdc376-bfd9-4146-80d0-89cad9bf5107 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.867 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] VM Started (Lifecycle Event)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.868 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.871 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.873 182096 INFO nova.virt.libvirt.driver [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Instance spawned successfully.
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.873 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.887 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.894 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.897 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.898 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.898 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.898 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.899 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.899 182096 DEBUG nova.virt.libvirt.driver [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.903 182096 DEBUG nova.compute.provider_tree [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.928 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.928 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160443.8669448, 2efdc376-bfd9-4146-80d0-89cad9bf5107 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.928 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] VM Paused (Lifecycle Event)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.934 182096 DEBUG nova.scheduler.client.report [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.961 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.962 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.965 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160443.8712714, 2efdc376-bfd9-4146-80d0-89cad9bf5107 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.965 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] VM Resumed (Lifecycle Event)
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.988 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.989 182096 INFO nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Took 3.90 seconds to spawn the instance on the hypervisor.
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.990 182096 DEBUG nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.991 182096 INFO nova.scheduler.client.report [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Deleted allocations for instance 179ad298-ad18-41dd-9b5e-31064aadf7d6
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.992 182096 DEBUG nova.network.neutron [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Updated VIF entry in instance network info cache for port 89cf9c6f-f3bf-489a-afce-46b308954126. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.992 182096 DEBUG nova.network.neutron [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Updating instance_info_cache with network_info: [{"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:23 compute-0 nova_compute[182092]: 2026-01-23 09:27:23.995 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.020 182096 DEBUG oslo_concurrency.lockutils [req-cfd5673a-e820-42db-b89c-0aac8fb46f62 req-c1e71991-12ed-4fed-8b51-6cdc24393393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2efdc376-bfd9-4146-80d0-89cad9bf5107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.021 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:27:24 compute-0 podman[221613]: 2026-01-23 09:27:24.061503546 +0000 UTC m=+0.032594445 container create 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.085 182096 INFO nova.compute.manager [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Took 4.58 seconds to build instance.
Jan 23 09:27:24 compute-0 systemd[1]: Started libpod-conmon-4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46.scope.
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.111 182096 DEBUG oslo_concurrency.lockutils [None req-ef8c3832-c2ca-45f7-9bc4-aa4934e4f1d0 89c019480e524c04af4d250b1c4051e5 860ef09b9e6e4866bbe99b6e769733a3 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:24 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:27:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e90987049554a2ac0479a21d15b2bb2b9e1f57f166b03a970b92b481e361e19e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.121 182096 DEBUG oslo_concurrency.lockutils [None req-daa35063-c326-41ec-8190-64523bc1996f da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:24 compute-0 podman[221613]: 2026-01-23 09:27:24.127933966 +0000 UTC m=+0.099024865 container init 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:27:24 compute-0 podman[221613]: 2026-01-23 09:27:24.132709475 +0000 UTC m=+0.103800373 container start 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:27:24 compute-0 podman[221613]: 2026-01-23 09:27:24.046430733 +0000 UTC m=+0.017521652 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:27:24 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [NOTICE]   (221629) : New worker (221631) forked
Jan 23 09:27:24 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [NOTICE]   (221629) : Loading success.
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.763 182096 DEBUG nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.764 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.764 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.765 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.765 182096 DEBUG nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.765 182096 WARNING nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-unplugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state deleted and task_state None.
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.765 182096 DEBUG nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.766 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.766 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.766 182096 DEBUG oslo_concurrency.lockutils [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "179ad298-ad18-41dd-9b5e-31064aadf7d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.766 182096 DEBUG nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] No waiting events found dispatching network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:24 compute-0 nova_compute[182092]: 2026-01-23 09:27:24.767 182096 WARNING nova.compute.manager [req-dbf60e88-adaa-4e91-9ec1-7690e50ddfb9 req-655b3fcf-7d1b-4ba1-bc08-03fef3f7687e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Received unexpected event network-vif-plugged-57bcbf69-c3cd-41cb-abf4-09620c4f7165 for instance with vm_state deleted and task_state None.
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.966 182096 DEBUG nova.compute.manager [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.966 182096 DEBUG oslo_concurrency.lockutils [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.966 182096 DEBUG oslo_concurrency.lockutils [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.966 182096 DEBUG oslo_concurrency.lockutils [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.967 182096 DEBUG nova.compute.manager [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] No waiting events found dispatching network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.967 182096 WARNING nova.compute.manager [req-09586104-8e12-4ee7-9bd5-a893520bc910 req-4b02d48e-3fa2-4710-b52e-aa79366250b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received unexpected event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 for instance with vm_state active and task_state deleting.
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.982 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.982 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.982 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.982 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.983 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:25 compute-0 nova_compute[182092]: 2026-01-23 09:27:25.989 182096 INFO nova.compute.manager [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Terminating instance
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.001 182096 DEBUG nova.compute.manager [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:27:26 compute-0 kernel: tap89cf9c6f-f3 (unregistering): left promiscuous mode
Jan 23 09:27:26 compute-0 NetworkManager[54920]: <info>  [1769160446.0219] device (tap89cf9c6f-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:27:26 compute-0 ovn_controller[94697]: 2026-01-23T09:27:26Z|00359|binding|INFO|Releasing lport 89cf9c6f-f3bf-489a-afce-46b308954126 from this chassis (sb_readonly=0)
Jan 23 09:27:26 compute-0 ovn_controller[94697]: 2026-01-23T09:27:26Z|00360|binding|INFO|Setting lport 89cf9c6f-f3bf-489a-afce-46b308954126 down in Southbound
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.027 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 ovn_controller[94697]: 2026-01-23T09:27:26Z|00361|binding|INFO|Removing iface tap89cf9c6f-f3 ovn-installed in OVS
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.029 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.050 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 23 09:27:26 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005e.scope: Consumed 2.382s CPU time.
Jan 23 09:27:26 compute-0 systemd-machined[153562]: Machine qemu-49-instance-0000005e terminated.
Jan 23 09:27:26 compute-0 podman[221636]: 2026-01-23 09:27:26.106689807 +0000 UTC m=+0.066363244 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.158 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:4a:c5 10.100.0.11'], port_security=['fa:16:3e:8b:4a:c5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2efdc376-bfd9-4146-80d0-89cad9bf5107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5b852465da47d2828e9d8bfdd0a944', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e2b39e56-b509-49e6-a69a-d99981fbafa0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb792439-35c1-4659-ac1f-183a0fed8110, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=89cf9c6f-f3bf-489a-afce-46b308954126) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.159 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 89cf9c6f-f3bf-489a-afce-46b308954126 in datapath 3d4e04ad-2f9e-41ad-a89f-c09560078c02 unbound from our chassis
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.161 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d4e04ad-2f9e-41ad-a89f-c09560078c02, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.161 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[835d52c8-4f46-4fa0-b97a-f7a531773b5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.162 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02 namespace which is not needed anymore
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.217 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.243 182096 INFO nova.virt.libvirt.driver [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Instance destroyed successfully.
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.244 182096 DEBUG nova.objects.instance [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lazy-loading 'resources' on Instance uuid 2efdc376-bfd9-4146-80d0-89cad9bf5107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [NOTICE]   (221629) : haproxy version is 2.8.14-c23fe91
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [NOTICE]   (221629) : path to executable is /usr/sbin/haproxy
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [WARNING]  (221629) : Exiting Master process...
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [WARNING]  (221629) : Exiting Master process...
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [ALERT]    (221629) : Current worker (221631) exited with code 143 (Terminated)
Jan 23 09:27:26 compute-0 neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02[221625]: [WARNING]  (221629) : All workers exited. Exiting... (0)
Jan 23 09:27:26 compute-0 systemd[1]: libpod-4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46.scope: Deactivated successfully.
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.255 182096 DEBUG nova.virt.libvirt.vif [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-772147004',display_name='tempest-ServerAddressesTestJSON-server-772147004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-772147004',id=94,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:27:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2a5b852465da47d2828e9d8bfdd0a944',ramdisk_id='',reservation_id='r-ck6tn3dv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-976814287',owner_user_name='tempest-ServerAddressesTestJSON-976814287-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:27:24Z,user_data=None,user_id='da28b720673344f98f7bca36bd875832',uuid=2efdc376-bfd9-4146-80d0-89cad9bf5107,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.255 182096 DEBUG nova.network.os_vif_util [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converting VIF {"id": "89cf9c6f-f3bf-489a-afce-46b308954126", "address": "fa:16:3e:8b:4a:c5", "network": {"id": "3d4e04ad-2f9e-41ad-a89f-c09560078c02", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1381125948-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a5b852465da47d2828e9d8bfdd0a944", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89cf9c6f-f3", "ovs_interfaceid": "89cf9c6f-f3bf-489a-afce-46b308954126", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.256 182096 DEBUG nova.network.os_vif_util [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.257 182096 DEBUG os_vif [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.259 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.259 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89cf9c6f-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.260 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 podman[221684]: 2026-01-23 09:27:26.261032939 +0000 UTC m=+0.037547448 container died 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.261 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.263 182096 INFO os_vif [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:4a:c5,bridge_name='br-int',has_traffic_filtering=True,id=89cf9c6f-f3bf-489a-afce-46b308954126,network=Network(3d4e04ad-2f9e-41ad-a89f-c09560078c02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89cf9c6f-f3')
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.264 182096 INFO nova.virt.libvirt.driver [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Deleting instance files /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107_del
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.264 182096 INFO nova.virt.libvirt.driver [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Deletion of /var/lib/nova/instances/2efdc376-bfd9-4146-80d0-89cad9bf5107_del complete
Jan 23 09:27:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46-userdata-shm.mount: Deactivated successfully.
Jan 23 09:27:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-e90987049554a2ac0479a21d15b2bb2b9e1f57f166b03a970b92b481e361e19e-merged.mount: Deactivated successfully.
Jan 23 09:27:26 compute-0 podman[221684]: 2026-01-23 09:27:26.28055242 +0000 UTC m=+0.057066929 container cleanup 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:27:26 compute-0 systemd[1]: libpod-conmon-4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46.scope: Deactivated successfully.
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.315 182096 INFO nova.compute.manager [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.316 182096 DEBUG oslo.service.loopingcall [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.316 182096 DEBUG nova.compute.manager [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.316 182096 DEBUG nova.network.neutron [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:27:26 compute-0 podman[221723]: 2026-01-23 09:27:26.320364118 +0000 UTC m=+0.023861631 container remove 4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.323 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[58f92dd4-bd95-4fcd-9ea6-0437a88291e9]: (4, ('Fri Jan 23 09:27:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02 (4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46)\n4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46\nFri Jan 23 09:27:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02 (4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46)\n4ef365f775a2f5002bd58740f9bc849ad088290d9c47e43a017e84671baa1a46\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.324 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4d1e4d-9c9f-4a1e-a956-5239824fa06d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.325 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d4e04ad-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.326 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 kernel: tap3d4e04ad-20: left promiscuous mode
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.338 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.341 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0070eb84-30e3-44d5-a011-f2a14f6b0ea4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.354 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ce226ae7-318d-4e2a-a22f-8dc2b74b3779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.354 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[166bfcbe-b8e2-4a64-8c6b-b9b5ab11f5e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.369 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[87c6df14-9a3b-4461-9809-2e6567e10723]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394908, 'reachable_time': 43406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221736, 'error': None, 'target': 'ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d3d4e04ad\x2d2f9e\x2d41ad\x2da89f\x2dc09560078c02.mount: Deactivated successfully.
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.370 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d4e04ad-2f9e-41ad-a89f-c09560078c02 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:27:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:26.370 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b53d17e3-a560-4597-bd22-84c9f63add6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.848 182096 DEBUG nova.network.neutron [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.859 182096 INFO nova.compute.manager [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Took 0.54 seconds to deallocate network for instance.
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.906 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.906 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.957 182096 DEBUG nova.compute.provider_tree [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.974 182096 DEBUG nova.scheduler.client.report [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:26 compute-0 nova_compute[182092]: 2026-01-23 09:27:26.996 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:27 compute-0 nova_compute[182092]: 2026-01-23 09:27:27.018 182096 INFO nova.scheduler.client.report [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Deleted allocations for instance 2efdc376-bfd9-4146-80d0-89cad9bf5107
Jan 23 09:27:27 compute-0 nova_compute[182092]: 2026-01-23 09:27:27.084 182096 DEBUG oslo_concurrency.lockutils [None req-7398f949-8cb1-4c8b-83ac-b5a43398e6a2 da28b720673344f98f7bca36bd875832 2a5b852465da47d2828e9d8bfdd0a944 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:27 compute-0 nova_compute[182092]: 2026-01-23 09:27:27.096 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.053 182096 DEBUG nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-vif-unplugged-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.054 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.054 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.054 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.054 182096 DEBUG nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] No waiting events found dispatching network-vif-unplugged-89cf9c6f-f3bf-489a-afce-46b308954126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.054 182096 WARNING nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received unexpected event network-vif-unplugged-89cf9c6f-f3bf-489a-afce-46b308954126 for instance with vm_state deleted and task_state None.
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 DEBUG nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 DEBUG oslo_concurrency.lockutils [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2efdc376-bfd9-4146-80d0-89cad9bf5107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 DEBUG nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] No waiting events found dispatching network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.055 182096 WARNING nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received unexpected event network-vif-plugged-89cf9c6f-f3bf-489a-afce-46b308954126 for instance with vm_state deleted and task_state None.
Jan 23 09:27:28 compute-0 nova_compute[182092]: 2026-01-23 09:27:28.056 182096 DEBUG nova.compute.manager [req-50edff58-f01c-4f52-b753-eddde27fb37e req-11454783-4ac6-47ab-9700-f0760c098579 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Received event network-vif-deleted-89cf9c6f-f3bf-489a-afce-46b308954126 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:27:31 compute-0 nova_compute[182092]: 2026-01-23 09:27:31.262 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:31 compute-0 nova_compute[182092]: 2026-01-23 09:27:31.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:31 compute-0 nova_compute[182092]: 2026-01-23 09:27:31.748 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:32 compute-0 nova_compute[182092]: 2026-01-23 09:27:32.098 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:33 compute-0 podman[221738]: 2026-01-23 09:27:33.200980889 +0000 UTC m=+0.039476767 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:27:33 compute-0 podman[221739]: 2026-01-23 09:27:33.225231814 +0000 UTC m=+0.062179593 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:27:36 compute-0 nova_compute[182092]: 2026-01-23 09:27:36.265 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:37 compute-0 nova_compute[182092]: 2026-01-23 09:27:37.100 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:37 compute-0 nova_compute[182092]: 2026-01-23 09:27:37.890 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160442.8892567, 179ad298-ad18-41dd-9b5e-31064aadf7d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:37 compute-0 nova_compute[182092]: 2026-01-23 09:27:37.890 182096 INFO nova.compute.manager [-] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] VM Stopped (Lifecycle Event)
Jan 23 09:27:37 compute-0 nova_compute[182092]: 2026-01-23 09:27:37.906 182096 DEBUG nova.compute.manager [None req-aa95c186-fd3b-43ff-943e-f4dc9c4ed464 - - - - - -] [instance: 179ad298-ad18-41dd-9b5e-31064aadf7d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:39.861 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:39.862 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:27:39.862 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:41 compute-0 nova_compute[182092]: 2026-01-23 09:27:41.243 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160446.2420268, 2efdc376-bfd9-4146-80d0-89cad9bf5107 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:27:41 compute-0 nova_compute[182092]: 2026-01-23 09:27:41.244 182096 INFO nova.compute.manager [-] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] VM Stopped (Lifecycle Event)
Jan 23 09:27:41 compute-0 nova_compute[182092]: 2026-01-23 09:27:41.258 182096 DEBUG nova.compute.manager [None req-cab3faf5-eaca-4aa5-a8f8-c1bafbdcf711 - - - - - -] [instance: 2efdc376-bfd9-4146-80d0-89cad9bf5107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:27:41 compute-0 nova_compute[182092]: 2026-01-23 09:27:41.266 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:42 compute-0 nova_compute[182092]: 2026-01-23 09:27:42.101 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:44 compute-0 podman[221777]: 2026-01-23 09:27:44.20134869 +0000 UTC m=+0.035024379 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:27:44 compute-0 podman[221776]: 2026-01-23 09:27:44.205235521 +0000 UTC m=+0.041119036 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 09:27:46 compute-0 podman[221814]: 2026-01-23 09:27:46.200144934 +0000 UTC m=+0.039104173 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=)
Jan 23 09:27:46 compute-0 nova_compute[182092]: 2026-01-23 09:27:46.269 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:47 compute-0 nova_compute[182092]: 2026-01-23 09:27:47.102 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:51 compute-0 nova_compute[182092]: 2026-01-23 09:27:51.272 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:52 compute-0 nova_compute[182092]: 2026-01-23 09:27:52.103 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:54 compute-0 nova_compute[182092]: 2026-01-23 09:27:54.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:27:54 compute-0 nova_compute[182092]: 2026-01-23 09:27:54.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:27:56 compute-0 podman[221832]: 2026-01-23 09:27:56.223325785 +0000 UTC m=+0.060207913 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:27:56 compute-0 nova_compute[182092]: 2026-01-23 09:27:56.275 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:57 compute-0 nova_compute[182092]: 2026-01-23 09:27:57.105 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.485 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.485 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.505 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.664 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.665 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.672 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.672 182096 INFO nova.compute.claims [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.811 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.811 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.825 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.827 182096 DEBUG nova.scheduler.client.report [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.886 182096 DEBUG nova.scheduler.client.report [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.886 182096 DEBUG nova.compute.provider_tree [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.889 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.901 182096 DEBUG nova.scheduler.client.report [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.919 182096 DEBUG nova.scheduler.client.report [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.969 182096 DEBUG nova.compute.provider_tree [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.983 182096 DEBUG nova.scheduler.client.report [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.997 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:58 compute-0 nova_compute[182092]: 2026-01-23 09:27:58.998 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.000 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.004 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.004 182096 INFO nova.compute.claims [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.065 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.065 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.077 182096 INFO nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.087 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.153 182096 DEBUG nova.compute.provider_tree [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.166 182096 DEBUG nova.scheduler.client.report [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.189 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.190 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.190 182096 INFO nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Creating image(s)
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.191 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.191 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.192 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.202 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.202 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.204 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.238 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.238 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.247 182096 DEBUG nova.policy [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.250 182096 INFO nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.252 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.252 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.253 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.262 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.275 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.307 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.308 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.329 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.329 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.330 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.367 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.368 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.369 182096 INFO nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Creating image(s)
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.369 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.369 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.370 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.380 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.380 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.392 182096 DEBUG nova.virt.disk.api [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Checking if we can resize image /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.393 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.409 182096 DEBUG nova.policy [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '515824da811b44c5975fc3e39e067bd4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d63881e7e674f288e8f5746af8eddeb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.435 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.436 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.436 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.446 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.458 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.459 182096 DEBUG nova.virt.disk.api [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Cannot resize image /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.459 182096 DEBUG nova.objects.instance [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'migration_context' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.472 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.472 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Ensure instance console log exists: /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.472 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.472 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.473 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.490 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.491 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.511 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.512 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.512 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.556 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.557 182096 DEBUG nova.virt.disk.api [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Checking if we can resize image /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.557 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.602 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.603 182096 DEBUG nova.virt.disk.api [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Cannot resize image /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.603 182096 DEBUG nova.objects.instance [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lazy-loading 'migration_context' on Instance uuid 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.622 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.622 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Ensure instance console log exists: /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.623 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.623 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.623 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.827 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Successfully created port: 91360d67-ca14-4940-989d-37d7efdfc1d7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:27:59 compute-0 nova_compute[182092]: 2026-01-23 09:27:59.888 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Successfully created port: 5d50299c-b077-40cc-9547-44725cb10cfc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.579 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Successfully updated port: 91360d67-ca14-4940-989d-37d7efdfc1d7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.592 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.592 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquired lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.592 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.655 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Successfully updated port: 5d50299c-b077-40cc-9547-44725cb10cfc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.663 182096 DEBUG nova.compute.manager [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.663 182096 DEBUG nova.compute.manager [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing instance network info cache due to event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.663 182096 DEBUG oslo_concurrency.lockutils [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.666 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.667 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquired lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.667 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.691 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.724 182096 DEBUG nova.compute.manager [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-changed-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.724 182096 DEBUG nova.compute.manager [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Refreshing instance network info cache due to event network-changed-5d50299c-b077-40cc-9547-44725cb10cfc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.724 182096 DEBUG oslo_concurrency.lockutils [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:00 compute-0 nova_compute[182092]: 2026-01-23 09:28:00.797 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.276 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.375 182096 DEBUG nova.network.neutron [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.389 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Releasing lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.389 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance network_info: |[{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.389 182096 DEBUG oslo_concurrency.lockutils [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.390 182096 DEBUG nova.network.neutron [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.391 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Start _get_guest_xml network_info=[{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.395 182096 WARNING nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.398 182096 DEBUG nova.virt.libvirt.host [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.399 182096 DEBUG nova.virt.libvirt.host [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.404 182096 DEBUG nova.virt.libvirt.host [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.404 182096 DEBUG nova.virt.libvirt.host [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.405 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.405 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.405 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.406 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.406 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.406 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.406 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.406 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.407 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.407 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.407 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.407 182096 DEBUG nova.virt.hardware [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.409 182096 DEBUG nova.virt.libvirt.vif [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-499831423',display_name='tempest-TestNetworkAdvancedServerOps-server-499831423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-499831423',id=98,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAIsYFCa/11HZR3PTxodDOgYteIa6L1Fy2AcBxm/cET+zB2mloDk36zPnZjwfjHoVBZJTHqUMTQzNneLTFMGKPbO7IrpXzXA4rU3Vf9S33Vyg1DVCmvSF4cG9nHo/x7+Ug==',key_name='tempest-TestNetworkAdvancedServerOps-844257604',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-lg0l11g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:59Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=86281329-601b-43f8-8481-663f35dbe261,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.410 182096 DEBUG nova.network.os_vif_util [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.410 182096 DEBUG nova.network.os_vif_util [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.411 182096 DEBUG nova.objects.instance [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.425 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <uuid>86281329-601b-43f8-8481-663f35dbe261</uuid>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <name>instance-00000062</name>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-499831423</nova:name>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:28:01</nova:creationTime>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:user uuid="2880f53bded147989ea61dc68ec0880e">tempest-TestNetworkAdvancedServerOps-169193993-project-member</nova:user>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:project uuid="5a5525bfc549464cace77d44548fb012">tempest-TestNetworkAdvancedServerOps-169193993</nova:project>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:port uuid="91360d67-ca14-4940-989d-37d7efdfc1d7">
Jan 23 09:28:01 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <system>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="serial">86281329-601b-43f8-8481-663f35dbe261</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="uuid">86281329-601b-43f8-8481-663f35dbe261</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </system>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <os>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </os>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <features>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </features>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.config"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:e5:22:00"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="tap91360d67-ca"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/console.log" append="off"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <video>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </video>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:28:01 compute-0 nova_compute[182092]: </domain>
Jan 23 09:28:01 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.425 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Preparing to wait for external event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.426 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.426 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.426 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.427 182096 DEBUG nova.virt.libvirt.vif [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-499831423',display_name='tempest-TestNetworkAdvancedServerOps-server-499831423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-499831423',id=98,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAIsYFCa/11HZR3PTxodDOgYteIa6L1Fy2AcBxm/cET+zB2mloDk36zPnZjwfjHoVBZJTHqUMTQzNneLTFMGKPbO7IrpXzXA4rU3Vf9S33Vyg1DVCmvSF4cG9nHo/x7+Ug==',key_name='tempest-TestNetworkAdvancedServerOps-844257604',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-lg0l11g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:59Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=86281329-601b-43f8-8481-663f35dbe261,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.427 182096 DEBUG nova.network.os_vif_util [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.427 182096 DEBUG nova.network.os_vif_util [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.428 182096 DEBUG os_vif [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.428 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.428 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.429 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.430 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.431 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91360d67-ca, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.431 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap91360d67-ca, col_values=(('external_ids', {'iface-id': '91360d67-ca14-4940-989d-37d7efdfc1d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:22:00', 'vm-uuid': '86281329-601b-43f8-8481-663f35dbe261'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 NetworkManager[54920]: <info>  [1769160481.4328] manager: (tap91360d67-ca): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.434 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.436 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.436 182096 INFO os_vif [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca')
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.470 182096 DEBUG nova.network.neutron [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Updating instance_info_cache with network_info: [{"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.473 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.473 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.473 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No VIF found with MAC fa:16:3e:e5:22:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.474 182096 INFO nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Using config drive
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.490 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Releasing lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.490 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Instance network_info: |[{"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.491 182096 DEBUG oslo_concurrency.lockutils [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.491 182096 DEBUG nova.network.neutron [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Refreshing network info cache for port 5d50299c-b077-40cc-9547-44725cb10cfc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.493 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Start _get_guest_xml network_info=[{"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.495 182096 WARNING nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.498 182096 DEBUG nova.virt.libvirt.host [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.498 182096 DEBUG nova.virt.libvirt.host [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.504 182096 DEBUG nova.virt.libvirt.host [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.504 182096 DEBUG nova.virt.libvirt.host [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.505 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.505 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.506 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.507 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.507 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.507 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.507 182096 DEBUG nova.virt.hardware [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.509 182096 DEBUG nova.virt.libvirt.vif [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1729204163',display_name='tempest-ServerMetadataTestJSON-server-1729204163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1729204163',id=99,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d63881e7e674f288e8f5746af8eddeb',ramdisk_id='',reservation_id='r-tguiv1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1492635001',owner_user_name='tempest-ServerMetadataTest
JSON-1492635001-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:59Z,user_data=None,user_id='515824da811b44c5975fc3e39e067bd4',uuid=8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.509 182096 DEBUG nova.network.os_vif_util [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converting VIF {"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.509 182096 DEBUG nova.network.os_vif_util [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.510 182096 DEBUG nova.objects.instance [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.522 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <uuid>8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3</uuid>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <name>instance-00000063</name>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerMetadataTestJSON-server-1729204163</nova:name>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:28:01</nova:creationTime>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:user uuid="515824da811b44c5975fc3e39e067bd4">tempest-ServerMetadataTestJSON-1492635001-project-member</nova:user>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:project uuid="6d63881e7e674f288e8f5746af8eddeb">tempest-ServerMetadataTestJSON-1492635001</nova:project>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         <nova:port uuid="5d50299c-b077-40cc-9547-44725cb10cfc">
Jan 23 09:28:01 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <system>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="serial">8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="uuid">8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </system>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <os>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </os>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <features>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </features>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.config"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:f8:4b:6f"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <target dev="tap5d50299c-b0"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/console.log" append="off"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <video>
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </video>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:28:01 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:28:01 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:28:01 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:28:01 compute-0 nova_compute[182092]: </domain>
Jan 23 09:28:01 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.522 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Preparing to wait for external event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.522 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.523 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.523 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.523 182096 DEBUG nova.virt.libvirt.vif [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:27:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1729204163',display_name='tempest-ServerMetadataTestJSON-server-1729204163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1729204163',id=99,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d63881e7e674f288e8f5746af8eddeb',ramdisk_id='',reservation_id='r-tguiv1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1492635001',owner_user_name='tempest-ServerMetadataTestJSON-1492635001-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:27:59Z,user_data=None,user_id='515824da811b44c5975fc3e39e067bd4',uuid=8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.523 182096 DEBUG nova.network.os_vif_util [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converting VIF {"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.524 182096 DEBUG nova.network.os_vif_util [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.524 182096 DEBUG os_vif [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.524 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.524 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.525 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.526 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d50299c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.527 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d50299c-b0, col_values=(('external_ids', {'iface-id': '5d50299c-b077-40cc-9547-44725cb10cfc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:4b:6f', 'vm-uuid': '8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 NetworkManager[54920]: <info>  [1769160481.5289] manager: (tap5d50299c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.529 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.533 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.533 182096 INFO os_vif [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0')
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.566 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.567 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.567 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] No VIF found with MAC fa:16:3e:f8:4b:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.567 182096 INFO nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Using config drive
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.762 182096 INFO nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Creating config drive at /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.config
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.766 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwvbgbqk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.859 182096 INFO nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Creating config drive at /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.config
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.863 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyad5s_0k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.883 182096 DEBUG oslo_concurrency.processutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwvbgbqk" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:01 compute-0 kernel: tap91360d67-ca: entered promiscuous mode
Jan 23 09:28:01 compute-0 NetworkManager[54920]: <info>  [1769160481.9190] manager: (tap91360d67-ca): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 23 09:28:01 compute-0 ovn_controller[94697]: 2026-01-23T09:28:01Z|00362|binding|INFO|Claiming lport 91360d67-ca14-4940-989d-37d7efdfc1d7 for this chassis.
Jan 23 09:28:01 compute-0 ovn_controller[94697]: 2026-01-23T09:28:01Z|00363|binding|INFO|91360d67-ca14-4940-989d-37d7efdfc1d7: Claiming fa:16:3e:e5:22:00 10.100.0.9
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.922 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.932 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:22:00 10.100.0.9'], port_security=['fa:16:3e:e5:22:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '86281329-601b-43f8-8481-663f35dbe261', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-338626c5-1f64-4e00-b560-256b83590866', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef5e74ef-33f5-4136-a29e-09c8fc96c8b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9baa215-6352-4e59-96e8-12bbc8110c9a, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=91360d67-ca14-4940-989d-37d7efdfc1d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.933 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 91360d67-ca14-4940-989d-37d7efdfc1d7 in datapath 338626c5-1f64-4e00-b560-256b83590866 bound to our chassis
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.934 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 338626c5-1f64-4e00-b560-256b83590866
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.942 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb8b698-e3ee-4160-9841-6480a9994333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.943 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap338626c5-11 in ovnmeta-338626c5-1f64-4e00-b560-256b83590866 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.944 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap338626c5-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.944 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[edd13e42-1bf4-437b-8e81-c9c076752a66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.945 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a7693a60-39da-4a4c-b39b-eaab71e5277b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:01 compute-0 systemd-udevd[221914]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:28:01 compute-0 systemd-machined[153562]: New machine qemu-50-instance-00000062.
Jan 23 09:28:01 compute-0 NetworkManager[54920]: <info>  [1769160481.9594] device (tap91360d67-ca): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:28:01 compute-0 NetworkManager[54920]: <info>  [1769160481.9602] device (tap91360d67-ca): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.960 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[547af08a-c57f-4d56-9bf7-70d8d1608e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.980 182096 DEBUG oslo_concurrency.processutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyad5s_0k" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:01.983 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b4af3c2a-ca5d-44f5-bd09-6313dc364a18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:01 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-00000062.
Jan 23 09:28:01 compute-0 nova_compute[182092]: 2026-01-23 09:28:01.992 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.003 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[23721b59-7137-4344-b130-f3ec599fbea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00364|binding|INFO|Setting lport 91360d67-ca14-4940-989d-37d7efdfc1d7 ovn-installed in OVS
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00365|binding|INFO|Setting lport 91360d67-ca14-4940-989d-37d7efdfc1d7 up in Southbound
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.0073] manager: (tap338626c5-10): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.007 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.006 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7f5b41-a66c-4493-b4ff-61514015fad4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.028 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ff25cc0f-bc07-4382-a84d-3677760b274a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.030 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[06f2c891-192d-45f7-a8e5-616af9ad1279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 kernel: tap5d50299c-b0: entered promiscuous mode
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.0458] manager: (tap5d50299c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/190)
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00366|binding|INFO|Claiming lport 5d50299c-b077-40cc-9547-44725cb10cfc for this chassis.
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00367|binding|INFO|5d50299c-b077-40cc-9547-44725cb10cfc: Claiming fa:16:3e:f8:4b:6f 10.100.0.3
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.047 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.0491] device (tap338626c5-10): carrier: link connected
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.053 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.052 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[df306183-366a-4daa-9a69-634c1bf72990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.060 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:4b:6f 10.100.0.3'], port_security=['fa:16:3e:f8:4b:6f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-053c9fc1-7210-4bb4-97b3-0a578f258011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d63881e7e674f288e8f5746af8eddeb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '31f115c9-4597-433c-aeb5-8297feb59b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b5a3c4-fb76-4c11-a3c5-5fefda4cb7b4, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=5d50299c-b077-40cc-9547-44725cb10cfc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.0652] device (tap5d50299c-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.0660] device (tap5d50299c-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.071 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6e9f02-088b-47b1-b753-7c91b2b83bea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap338626c5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:51:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398753, 'reachable_time': 26017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221952, 'error': None, 'target': 'ovnmeta-338626c5-1f64-4e00-b560-256b83590866', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.083 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1fe196-0bad-49b2-b348-2237a1fe87df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:513d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398753, 'tstamp': 398753}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221955, 'error': None, 'target': 'ovnmeta-338626c5-1f64-4e00-b560-256b83590866', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.097 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0d82f3-238e-4418-9d6e-07b3db9532b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap338626c5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:51:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398753, 'reachable_time': 26017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221956, 'error': None, 'target': 'ovnmeta-338626c5-1f64-4e00-b560-256b83590866', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 systemd-machined[153562]: New machine qemu-51-instance-00000063.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.107 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.108 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-00000063.
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00368|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc ovn-installed in OVS
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00369|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc up in Southbound
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.115 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.124 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f8adb84a-2372-49c6-9ad0-225b7179fffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.165 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3343a9-6a4b-41f0-a287-efc12dc344a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.166 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap338626c5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.166 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.166 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap338626c5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 kernel: tap338626c5-10: entered promiscuous mode
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.1692] manager: (tap338626c5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.167 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.171 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.173 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap338626c5-10, col_values=(('external_ids', {'iface-id': '8f5fbc9a-d56d-440e-b553-45cbeacd70e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.174 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00370|binding|INFO|Releasing lport 8f5fbc9a-d56d-440e-b553-45cbeacd70e2 from this chassis (sb_readonly=0)
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.177 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/338626c5-1f64-4e00-b560-256b83590866.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/338626c5-1f64-4e00-b560-256b83590866.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.177 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[594d057c-fc65-4b65-b685-d2b2fdb27bff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.178 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-338626c5-1f64-4e00-b560-256b83590866
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/338626c5-1f64-4e00-b560-256b83590866.pid.haproxy
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 338626c5-1f64-4e00-b560-256b83590866
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.179 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-338626c5-1f64-4e00-b560-256b83590866', 'env', 'PROCESS_TAG=haproxy-338626c5-1f64-4e00-b560-256b83590866', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/338626c5-1f64-4e00-b560-256b83590866.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.186 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.345 182096 DEBUG nova.compute.manager [req-1fc91aa8-e139-47c0-90f0-5d8e41e8aad3 req-e3c4c95d-9e92-4b1d-8e43-a5a5a61f24e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.345 182096 DEBUG oslo_concurrency.lockutils [req-1fc91aa8-e139-47c0-90f0-5d8e41e8aad3 req-e3c4c95d-9e92-4b1d-8e43-a5a5a61f24e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.346 182096 DEBUG oslo_concurrency.lockutils [req-1fc91aa8-e139-47c0-90f0-5d8e41e8aad3 req-e3c4c95d-9e92-4b1d-8e43-a5a5a61f24e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.346 182096 DEBUG oslo_concurrency.lockutils [req-1fc91aa8-e139-47c0-90f0-5d8e41e8aad3 req-e3c4c95d-9e92-4b1d-8e43-a5a5a61f24e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.346 182096 DEBUG nova.compute.manager [req-1fc91aa8-e139-47c0-90f0-5d8e41e8aad3 req-e3c4c95d-9e92-4b1d-8e43-a5a5a61f24e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Processing event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.388 182096 DEBUG nova.network.neutron [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updated VIF entry in instance network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.388 182096 DEBUG nova.network.neutron [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.403 182096 DEBUG oslo_concurrency.lockutils [req-6f912a41-afe7-4a2d-9831-28aa521e8a26 req-72a2e9b2-fdb9-4ca4-941c-be2ec9a72735 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:02 compute-0 podman[221999]: 2026-01-23 09:28:02.47778212 +0000 UTC m=+0.039171762 container create 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.492 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.492345, 86281329-601b-43f8-8481-663f35dbe261 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.493 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Started (Lifecycle Event)
Jan 23 09:28:02 compute-0 systemd[1]: Started libpod-conmon-790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b.scope.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.510 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.512 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.4925723, 86281329-601b-43f8-8481-663f35dbe261 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.512 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Paused (Lifecycle Event)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.525 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:02 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:28:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb805da73109bdc45eb032bd84f75421c26c489b7f41104060628c5e59961384/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.531 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:02 compute-0 podman[221999]: 2026-01-23 09:28:02.536227207 +0000 UTC m=+0.097616849 container init 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:28:02 compute-0 podman[221999]: 2026-01-23 09:28:02.540219728 +0000 UTC m=+0.101609359 container start 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:28:02 compute-0 podman[221999]: 2026-01-23 09:28:02.455740121 +0000 UTC m=+0.017129773 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.544 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:02 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [NOTICE]   (222016) : New worker (222018) forked
Jan 23 09:28:02 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [NOTICE]   (222016) : Loading success.
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.582 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 5d50299c-b077-40cc-9547-44725cb10cfc in datapath 053c9fc1-7210-4bb4-97b3-0a578f258011 unbound from our chassis
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.583 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 053c9fc1-7210-4bb4-97b3-0a578f258011
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.590 182096 DEBUG nova.network.neutron [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Updated VIF entry in instance network info cache for port 5d50299c-b077-40cc-9547-44725cb10cfc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.591 182096 DEBUG nova.network.neutron [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Updating instance_info_cache with network_info: [{"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.594 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[24493ca2-9529-47fa-b149-546d698c1ae9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.595 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap053c9fc1-71 in ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.596 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap053c9fc1-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.596 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[87922980-8f9b-4ef5-a893-440a60a70c3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.597 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[402cc601-93f7-44ab-beda-4ce26497820c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.605 182096 DEBUG oslo_concurrency.lockutils [req-3d25634d-fd29-48ea-ae2a-e95e5c7bb239 req-2efa7d73-d8d4-4ba8-8045-9985d7fcb05b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.608 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ed17659a-22ca-4f56-837a-beebf752f76a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.618 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34c231d5-9658-4799-bfcb-a14abacf8a7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.638 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6c08dc4f-da3b-4390-8f4e-aa08f930e7e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.643 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1951c3-a164-4044-8e4c-f008da52f503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.6437] manager: (tap053c9fc1-70): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.667 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a8aed49b-01a6-4a07-b2b4-989ae28fe628]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.669 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[443fd457-84e3-4050-8a28-3e3eb4119b46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.6875] device (tap053c9fc1-70): carrier: link connected
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.692 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ed20aacf-3388-4dab-8d97-eadd196539e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.703 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9f956a4e-9a15-4b40-991d-583e3bafa117]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap053c9fc1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:bf:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398817, 'reachable_time': 21008, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222033, 'error': None, 'target': 'ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.715 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6b81e8a8-b2ee-49aa-823c-1f7d3ed203c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:bff4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398817, 'tstamp': 398817}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222034, 'error': None, 'target': 'ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.725 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3c186a78-942c-404a-b820-54257d790cbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap053c9fc1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:bf:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398817, 'reachable_time': 21008, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222035, 'error': None, 'target': 'ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.747 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6627c58b-9895-49e1-af52-c1e19bce249f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.784 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a928a386-1c9d-4712-962a-146d74eebbce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.785 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap053c9fc1-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.785 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.786 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap053c9fc1-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 NetworkManager[54920]: <info>  [1769160482.7880] manager: (tap053c9fc1-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.787 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 kernel: tap053c9fc1-70: entered promiscuous mode
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.791 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.796 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap053c9fc1-70, col_values=(('external_ids', {'iface-id': '9ed575e3-2e8a-4b2a-9423-d7dd8bc7c3cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_controller[94697]: 2026-01-23T09:28:02Z|00371|binding|INFO|Releasing lport 9ed575e3-2e8a-4b2a-9423-d7dd8bc7c3cb from this chassis (sb_readonly=0)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.809 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.810 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/053c9fc1-7210-4bb4-97b3-0a578f258011.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/053c9fc1-7210-4bb4-97b3-0a578f258011.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.811 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b687b8a4-16a7-42b5-8330-439b568004ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.811 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-053c9fc1-7210-4bb4-97b3-0a578f258011
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/053c9fc1-7210-4bb4-97b3-0a578f258011.pid.haproxy
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 053c9fc1-7210-4bb4-97b3-0a578f258011
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:28:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:02.812 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011', 'env', 'PROCESS_TAG=haproxy-053c9fc1-7210-4bb4-97b3-0a578f258011', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/053c9fc1-7210-4bb4-97b3-0a578f258011.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.814 182096 DEBUG nova.compute.manager [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.814 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.814 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.814 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG nova.compute.manager [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Processing event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG nova.compute.manager [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG oslo_concurrency.lockutils [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.815 182096 DEBUG nova.compute.manager [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] No waiting events found dispatching network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.816 182096 WARNING nova.compute.manager [req-ca07909a-a3b2-4beb-9467-2d4d3f9a20ef req-a34f7c27-e5a3-4ab1-b28b-4ee35f6edf97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received unexpected event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 for instance with vm_state building and task_state spawning.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.816 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.818 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.8186, 86281329-601b-43f8-8481-663f35dbe261 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.819 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Resumed (Lifecycle Event)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.830 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.833 182096 INFO nova.virt.libvirt.driver [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance spawned successfully.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.833 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.842 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.846 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.850 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.851 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.851 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.851 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.852 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.852 182096 DEBUG nova.virt.libvirt.driver [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.874 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.910 182096 INFO nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Took 3.72 seconds to spawn the instance on the hypervisor.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.910 182096 DEBUG nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.913 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.914 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.913927, 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.914 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] VM Started (Lifecycle Event)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.921 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.923 182096 INFO nova.virt.libvirt.driver [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Instance spawned successfully.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.923 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.951 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.953 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.969 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.969 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.969 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.970 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.970 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.971 182096 DEBUG nova.virt.libvirt.driver [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.974 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.974 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.9146001, 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.974 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] VM Paused (Lifecycle Event)
Jan 23 09:28:02 compute-0 nova_compute[182092]: 2026-01-23 09:28:02.998 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.000 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160482.9201343, 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.000 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] VM Resumed (Lifecycle Event)
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.014 182096 INFO nova.compute.manager [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Took 4.47 seconds to build instance.
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.024 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.026 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.050 182096 INFO nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Took 3.68 seconds to spawn the instance on the hypervisor.
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.050 182096 DEBUG nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.051 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.091 182096 DEBUG oslo_concurrency.lockutils [None req-d1922a5e-a81d-49ce-944f-8478a757e0c8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.124 182096 INFO nova.compute.manager [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Took 4.26 seconds to build instance.
Jan 23 09:28:03 compute-0 podman[222070]: 2026-01-23 09:28:03.126426589 +0000 UTC m=+0.040362446 container create 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.139 182096 DEBUG oslo_concurrency.lockutils [None req-b5735c72-da28-41fc-a700-c2bb72bd7707 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:03 compute-0 systemd[1]: Started libpod-conmon-9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9.scope.
Jan 23 09:28:03 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:28:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f9428c72ac01d84cd6706e12a31eb3903191f5b1e98295c3538b5d6d5906f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:28:03 compute-0 podman[222070]: 2026-01-23 09:28:03.185545196 +0000 UTC m=+0.099481064 container init 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:28:03 compute-0 podman[222070]: 2026-01-23 09:28:03.189650491 +0000 UTC m=+0.103586348 container start 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:28:03 compute-0 podman[222070]: 2026-01-23 09:28:03.112839449 +0000 UTC m=+0.026775337 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:28:03 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [NOTICE]   (222086) : New worker (222088) forked
Jan 23 09:28:03 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [NOTICE]   (222086) : Loading success.
Jan 23 09:28:03 compute-0 nova_compute[182092]: 2026-01-23 09:28:03.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:04 compute-0 podman[222093]: 2026-01-23 09:28:04.23810303 +0000 UTC m=+0.077169359 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:28:04 compute-0 podman[222094]: 2026-01-23 09:28:04.266329475 +0000 UTC m=+0.103598462 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 DEBUG nova.compute.manager [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 DEBUG oslo_concurrency.lockutils [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 DEBUG oslo_concurrency.lockutils [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 DEBUG oslo_concurrency.lockutils [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 DEBUG nova.compute.manager [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.422 182096 WARNING nova.compute.manager [req-72eae4fc-ff53-4a88-99ba-0cd043305558 req-1be09b09-7727-4083-b7b7-ce57c140fad2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state active and task_state None.
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.664 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.664 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.665 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.665 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.718 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:04 compute-0 NetworkManager[54920]: <info>  [1769160484.7666] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 23 09:28:04 compute-0 NetworkManager[54920]: <info>  [1769160484.7672] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.773 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.788 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.789 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:04 compute-0 ovn_controller[94697]: 2026-01-23T09:28:04Z|00372|binding|INFO|Releasing lport 9ed575e3-2e8a-4b2a-9423-d7dd8bc7c3cb from this chassis (sb_readonly=0)
Jan 23 09:28:04 compute-0 ovn_controller[94697]: 2026-01-23T09:28:04Z|00373|binding|INFO|Releasing lport 8f5fbc9a-d56d-440e-b553-45cbeacd70e2 from this chassis (sb_readonly=0)
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.900 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.907 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261/disk --force-share --output=json" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.912 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.975 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.976 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.992 182096 DEBUG nova.compute.manager [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.993 182096 DEBUG nova.compute.manager [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing instance network info cache due to event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.993 182096 DEBUG oslo_concurrency.lockutils [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.993 182096 DEBUG oslo_concurrency.lockutils [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:04 compute-0 nova_compute[182092]: 2026-01-23 09:28:04.994 182096 DEBUG nova.network.neutron [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.035 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.341 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.344 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5387MB free_disk=73.2621078491211GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.345 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.345 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.447 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 86281329-601b-43f8-8481-663f35dbe261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.447 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.447 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.448 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.499 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.516 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.534 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:28:05 compute-0 nova_compute[182092]: 2026-01-23 09:28:05.535 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.462 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.463 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.463 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.463 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.464 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.470 182096 INFO nova.compute.manager [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Terminating instance
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.476 182096 DEBUG nova.compute.manager [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:28:06 compute-0 kernel: tap5d50299c-b0 (unregistering): left promiscuous mode
Jan 23 09:28:06 compute-0 NetworkManager[54920]: <info>  [1769160486.5025] device (tap5d50299c-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00374|binding|INFO|Releasing lport 5d50299c-b077-40cc-9547-44725cb10cfc from this chassis (sb_readonly=0)
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00375|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc down in Southbound
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00376|binding|INFO|Removing iface tap5d50299c-b0 ovn-installed in OVS
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.510 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.513 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:4b:6f 10.100.0.3'], port_security=['fa:16:3e:f8:4b:6f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-053c9fc1-7210-4bb4-97b3-0a578f258011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d63881e7e674f288e8f5746af8eddeb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '31f115c9-4597-433c-aeb5-8297feb59b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b5a3c4-fb76-4c11-a3c5-5fefda4cb7b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=5d50299c-b077-40cc-9547-44725cb10cfc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.514 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 5d50299c-b077-40cc-9547-44725cb10cfc in datapath 053c9fc1-7210-4bb4-97b3-0a578f258011 unbound from our chassis
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.516 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 053c9fc1-7210-4bb4-97b3-0a578f258011, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.521 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf87b1d-234a-4bfd-989f-484aa9fd1d88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.522 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011 namespace which is not needed anymore
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.523 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.528 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.535 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:06 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000063.scope: Deactivated successfully.
Jan 23 09:28:06 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000063.scope: Consumed 4.184s CPU time.
Jan 23 09:28:06 compute-0 systemd-machined[153562]: Machine qemu-51-instance-00000063 terminated.
Jan 23 09:28:06 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [NOTICE]   (222086) : haproxy version is 2.8.14-c23fe91
Jan 23 09:28:06 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [NOTICE]   (222086) : path to executable is /usr/sbin/haproxy
Jan 23 09:28:06 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [ALERT]    (222086) : Current worker (222088) exited with code 143 (Terminated)
Jan 23 09:28:06 compute-0 neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011[222082]: [WARNING]  (222086) : All workers exited. Exiting... (0)
Jan 23 09:28:06 compute-0 systemd[1]: libpod-9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9.scope: Deactivated successfully.
Jan 23 09:28:06 compute-0 conmon[222082]: conmon 9d19da5949e255dab84f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9.scope/container/memory.events
Jan 23 09:28:06 compute-0 podman[222171]: 2026-01-23 09:28:06.621875798 +0000 UTC m=+0.035317452 container died 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.637 182096 DEBUG nova.network.neutron [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updated VIF entry in instance network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.638 182096 DEBUG nova.network.neutron [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9-userdata-shm.mount: Deactivated successfully.
Jan 23 09:28:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-e6f9428c72ac01d84cd6706e12a31eb3903191f5b1e98295c3538b5d6d5906f2-merged.mount: Deactivated successfully.
Jan 23 09:28:06 compute-0 podman[222171]: 2026-01-23 09:28:06.643277829 +0000 UTC m=+0.056719484 container cleanup 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:28:06 compute-0 systemd[1]: libpod-conmon-9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9.scope: Deactivated successfully.
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.663 182096 DEBUG oslo_concurrency.lockutils [req-be9ef4c1-8690-4d28-ac55-5c831801a6de req-b21e0ccc-5547-4c34-baa0-22effc1c1fba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.666 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.669 182096 DEBUG nova.compute.manager [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.669 182096 DEBUG oslo_concurrency.lockutils [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.670 182096 DEBUG oslo_concurrency.lockutils [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.670 182096 DEBUG oslo_concurrency.lockutils [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.670 182096 DEBUG nova.compute.manager [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.670 182096 DEBUG nova.compute.manager [req-81946331-f7a2-4696-b924-67c27e66d9b4 req-017cc193-4556-4795-9a82-1cd1eaa1f930 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:28:06 compute-0 podman[222196]: 2026-01-23 09:28:06.68531213 +0000 UTC m=+0.026892229 container remove 9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:28:06 compute-0 kernel: tap5d50299c-b0: entered promiscuous mode
Jan 23 09:28:06 compute-0 kernel: tap5d50299c-b0 (unregistering): left promiscuous mode
Jan 23 09:28:06 compute-0 NetworkManager[54920]: <info>  [1769160486.6908] manager: (tap5d50299c-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00377|binding|INFO|Claiming lport 5d50299c-b077-40cc-9547-44725cb10cfc for this chassis.
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00378|binding|INFO|5d50299c-b077-40cc-9547-44725cb10cfc: Claiming fa:16:3e:f8:4b:6f 10.100.0.3
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.696 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[789dc105-02d3-430a-b685-b31a06b7daa8]: (4, ('Fri Jan 23 09:28:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011 (9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9)\n9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9\nFri Jan 23 09:28:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011 (9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9)\n9d19da5949e255dab84f3ddcb563f0fe1029c8e0e21ed132ccbdd88949e2f0c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.701 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:4b:6f 10.100.0.3'], port_security=['fa:16:3e:f8:4b:6f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-053c9fc1-7210-4bb4-97b3-0a578f258011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d63881e7e674f288e8f5746af8eddeb', 'neutron:revision_number': '5', 'neutron:security_group_ids': '31f115c9-4597-433c-aeb5-8297feb59b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b5a3c4-fb76-4c11-a3c5-5fefda4cb7b4, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=5d50299c-b077-40cc-9547-44725cb10cfc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00379|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc up in Southbound
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.710 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa4130f-83b4-404e-b845-d5de07c51599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.712 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00380|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc ovn-installed in OVS
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.716 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap053c9fc1-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.717 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 kernel: tap053c9fc1-70: left promiscuous mode
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00381|binding|INFO|Releasing lport 5d50299c-b077-40cc-9547-44725cb10cfc from this chassis (sb_readonly=0)
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00382|binding|INFO|Setting lport 5d50299c-b077-40cc-9547-44725cb10cfc down in Southbound
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.733 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_controller[94697]: 2026-01-23T09:28:06Z|00383|binding|INFO|Removing iface tap5d50299c-b0 ovn-installed in OVS
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.736 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f80834d7-72cc-4dfa-a6af-1087ce416a0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.740 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:4b:6f 10.100.0.3'], port_security=['fa:16:3e:f8:4b:6f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-053c9fc1-7210-4bb4-97b3-0a578f258011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d63881e7e674f288e8f5746af8eddeb', 'neutron:revision_number': '5', 'neutron:security_group_ids': '31f115c9-4597-433c-aeb5-8297feb59b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b5a3c4-fb76-4c11-a3c5-5fefda4cb7b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=5d50299c-b077-40cc-9547-44725cb10cfc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.737 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.739 182096 INFO nova.virt.libvirt.driver [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Instance destroyed successfully.
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.740 182096 DEBUG nova.objects.instance [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lazy-loading 'resources' on Instance uuid 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c53a05e-1046-41e3-9bcb-3f50798a4342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.753 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[878fa438-0146-4be1-b64b-ab2716322fac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.751 182096 DEBUG nova.virt.libvirt.vif [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:27:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1729204163',display_name='tempest-ServerMetadataTestJSON-server-1729204163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1729204163',id=99,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:28:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d63881e7e674f288e8f5746af8eddeb',ramdisk_id='',reservation_id='r-tguiv1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vir
tio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-1492635001',owner_user_name='tempest-ServerMetadataTestJSON-1492635001-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:28:06Z,user_data=None,user_id='515824da811b44c5975fc3e39e067bd4',uuid=8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.751 182096 DEBUG nova.network.os_vif_util [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converting VIF {"id": "5d50299c-b077-40cc-9547-44725cb10cfc", "address": "fa:16:3e:f8:4b:6f", "network": {"id": "053c9fc1-7210-4bb4-97b3-0a578f258011", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1655346663-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d63881e7e674f288e8f5746af8eddeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d50299c-b0", "ovs_interfaceid": "5d50299c-b077-40cc-9547-44725cb10cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.752 182096 DEBUG nova.network.os_vif_util [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.752 182096 DEBUG os_vif [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.754 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.754 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d50299c-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.756 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.757 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.759 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.761 182096 INFO os_vif [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:4b:6f,bridge_name='br-int',has_traffic_filtering=True,id=5d50299c-b077-40cc-9547-44725cb10cfc,network=Network(053c9fc1-7210-4bb4-97b3-0a578f258011),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d50299c-b0')
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.761 182096 INFO nova.virt.libvirt.driver [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Deleting instance files /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3_del
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.762 182096 INFO nova.virt.libvirt.driver [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Deletion of /var/lib/nova/instances/8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3_del complete
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.768 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c26c296e-a43f-49a2-bd29-5e6220d73587]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398812, 'reachable_time': 43906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222224, 'error': None, 'target': 'ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d053c9fc1\x2d7210\x2d4bb4\x2d97b3\x2d0a578f258011.mount: Deactivated successfully.
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.770 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-053c9fc1-7210-4bb4-97b3-0a578f258011 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.771 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[68fe92ab-ddad-43ca-87c3-c6e9f9d1bc2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.772 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 5d50299c-b077-40cc-9547-44725cb10cfc in datapath 053c9fc1-7210-4bb4-97b3-0a578f258011 unbound from our chassis
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.773 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.773 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 053c9fc1-7210-4bb4-97b3-0a578f258011, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.774 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.775 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[030e5e00-c77d-4cc6-bc2f-4596d740ea76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.775 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 5d50299c-b077-40cc-9547-44725cb10cfc in datapath 053c9fc1-7210-4bb4-97b3-0a578f258011 unbound from our chassis
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.777 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 053c9fc1-7210-4bb4-97b3-0a578f258011, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.777 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ddccec53-d27d-4247-895c-012e0ebd326b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:06 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:06.778 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.807 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.808 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.808 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.808 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.839 182096 INFO nova.compute.manager [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.840 182096 DEBUG oslo.service.loopingcall [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.840 182096 DEBUG nova.compute.manager [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:28:06 compute-0 nova_compute[182092]: 2026-01-23 09:28:06.840 182096 DEBUG nova.network.neutron [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.113 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.342 182096 DEBUG nova.network.neutron [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.353 182096 INFO nova.compute.manager [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Took 0.51 seconds to deallocate network for instance.
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.399 182096 DEBUG nova.compute.manager [req-36fef15e-105b-4fe2-b320-1df57f2350ad req-23c4f024-7fbf-4e44-8092-c632116e0569 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-deleted-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.423 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.424 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.468 182096 DEBUG nova.compute.provider_tree [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.478 182096 DEBUG nova.scheduler.client.report [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.490 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.506 182096 INFO nova.scheduler.client.report [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Deleted allocations for instance 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.552 182096 DEBUG oslo_concurrency.lockutils [None req-34dec886-3ffe-415b-aaf2-5ec6d8bdd668 515824da811b44c5975fc3e39e067bd4 6d63881e7e674f288e8f5746af8eddeb - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.744 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.754 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.754 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.754 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.755 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:07 compute-0 nova_compute[182092]: 2026-01-23 09:28:07.755 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:28:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:07.780 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.668 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.741 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.741 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.742 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.742 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.742 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.743 182096 WARNING nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state deleted and task_state None.
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.743 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.743 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.743 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.744 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.744 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.744 182096 WARNING nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state deleted and task_state None.
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.744 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.745 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.745 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.745 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.745 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.746 182096 WARNING nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state deleted and task_state None.
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.746 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.746 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.746 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.747 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.747 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.747 182096 WARNING nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-unplugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state deleted and task_state None.
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.747 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.748 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.748 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.748 182096 DEBUG oslo_concurrency.lockutils [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.748 182096 DEBUG nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] No waiting events found dispatching network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:08 compute-0 nova_compute[182092]: 2026-01-23 09:28:08.749 182096 WARNING nova.compute.manager [req-e02a901f-1039-4666-a2a0-b6569798ad08 req-760b968e-d3eb-4fc8-a5bc-36f627f25519 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Received unexpected event network-vif-plugged-5d50299c-b077-40cc-9547-44725cb10cfc for instance with vm_state deleted and task_state None.
Jan 23 09:28:09 compute-0 nova_compute[182092]: 2026-01-23 09:28:09.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:11 compute-0 nova_compute[182092]: 2026-01-23 09:28:11.756 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:12 compute-0 nova_compute[182092]: 2026-01-23 09:28:12.115 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:14 compute-0 nova_compute[182092]: 2026-01-23 09:28:14.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:14 compute-0 ovn_controller[94697]: 2026-01-23T09:28:14Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:22:00 10.100.0.9
Jan 23 09:28:14 compute-0 ovn_controller[94697]: 2026-01-23T09:28:14Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:22:00 10.100.0.9
Jan 23 09:28:15 compute-0 podman[222235]: 2026-01-23 09:28:15.198227429 +0000 UTC m=+0.035073993 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:28:15 compute-0 podman[222236]: 2026-01-23 09:28:15.21219199 +0000 UTC m=+0.048474901 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:28:16 compute-0 ovn_controller[94697]: 2026-01-23T09:28:16Z|00384|binding|INFO|Releasing lport 8f5fbc9a-d56d-440e-b553-45cbeacd70e2 from this chassis (sb_readonly=0)
Jan 23 09:28:16 compute-0 nova_compute[182092]: 2026-01-23 09:28:16.265 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:16 compute-0 nova_compute[182092]: 2026-01-23 09:28:16.759 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:17 compute-0 nova_compute[182092]: 2026-01-23 09:28:17.116 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:17 compute-0 podman[222273]: 2026-01-23 09:28:17.22826752 +0000 UTC m=+0.066039224 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.446 182096 INFO nova.compute.manager [None req-1fa817c9-e7b8-45c6-a4a0-57ee8788108f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Get console output
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.449 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.643 182096 INFO nova.compute.manager [None req-e77bb07f-0c40-43e9-8da5-eb062af16f03 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Pausing
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.644 182096 DEBUG nova.objects.instance [None req-e77bb07f-0c40-43e9-8da5-eb062af16f03 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'flavor' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.679 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160500.6790478, 86281329-601b-43f8-8481-663f35dbe261 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.679 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Paused (Lifecycle Event)
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.680 182096 DEBUG nova.compute.manager [None req-e77bb07f-0c40-43e9-8da5-eb062af16f03 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.698 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:20 compute-0 nova_compute[182092]: 2026-01-23 09:28:20.700 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:21 compute-0 nova_compute[182092]: 2026-01-23 09:28:21.733 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160486.7285473, 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:21 compute-0 nova_compute[182092]: 2026-01-23 09:28:21.733 182096 INFO nova.compute.manager [-] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] VM Stopped (Lifecycle Event)
Jan 23 09:28:21 compute-0 nova_compute[182092]: 2026-01-23 09:28:21.744 182096 DEBUG nova.compute.manager [None req-df265b32-6991-43f6-be85-b0323115f160 - - - - - -] [instance: 8b84fb7b-c80d-47e4-a39f-fa4738c5f1b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:21 compute-0 nova_compute[182092]: 2026-01-23 09:28:21.760 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:22 compute-0 nova_compute[182092]: 2026-01-23 09:28:22.119 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:22 compute-0 nova_compute[182092]: 2026-01-23 09:28:22.990 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.807 182096 INFO nova.compute.manager [None req-b9ddd7ba-4831-4bcc-8bc3-3806438566e8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Get console output
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.810 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.927 182096 INFO nova.compute.manager [None req-81f374c3-28db-4ec3-9ba0-595891d9a08a 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Unpausing
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.928 182096 DEBUG nova.objects.instance [None req-81f374c3-28db-4ec3-9ba0-595891d9a08a 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'flavor' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.950 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160503.9506207, 86281329-601b-43f8-8481-663f35dbe261 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.951 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Resumed (Lifecycle Event)
Jan 23 09:28:23 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.954 182096 DEBUG nova.virt.libvirt.guest [None req-81f374c3-28db-4ec3-9ba0-595891d9a08a 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.954 182096 DEBUG nova.compute.manager [None req-81f374c3-28db-4ec3-9ba0-595891d9a08a 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.979 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.981 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:23 compute-0 nova_compute[182092]: 2026-01-23 09:28:23.999 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 23 09:28:26 compute-0 nova_compute[182092]: 2026-01-23 09:28:26.415 182096 INFO nova.compute.manager [None req-f7dd3999-2ec4-47d4-a182-59a6735d81d8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Get console output
Jan 23 09:28:26 compute-0 nova_compute[182092]: 2026-01-23 09:28:26.418 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:28:26 compute-0 nova_compute[182092]: 2026-01-23 09:28:26.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.120 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 podman[222291]: 2026-01-23 09:28:27.234848167 +0000 UTC m=+0.058996607 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.610 182096 DEBUG nova.compute.manager [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.610 182096 DEBUG nova.compute.manager [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing instance network info cache due to event network-changed-91360d67-ca14-4940-989d-37d7efdfc1d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.610 182096 DEBUG oslo_concurrency.lockutils [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.610 182096 DEBUG oslo_concurrency.lockutils [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.610 182096 DEBUG nova.network.neutron [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Refreshing network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.685 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.685 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.686 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.686 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.686 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.692 182096 INFO nova.compute.manager [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Terminating instance
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.698 182096 DEBUG nova.compute.manager [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:28:27 compute-0 kernel: tap91360d67-ca (unregistering): left promiscuous mode
Jan 23 09:28:27 compute-0 NetworkManager[54920]: <info>  [1769160507.7277] device (tap91360d67-ca): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.731 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 ovn_controller[94697]: 2026-01-23T09:28:27Z|00385|binding|INFO|Releasing lport 91360d67-ca14-4940-989d-37d7efdfc1d7 from this chassis (sb_readonly=0)
Jan 23 09:28:27 compute-0 ovn_controller[94697]: 2026-01-23T09:28:27Z|00386|binding|INFO|Setting lport 91360d67-ca14-4940-989d-37d7efdfc1d7 down in Southbound
Jan 23 09:28:27 compute-0 ovn_controller[94697]: 2026-01-23T09:28:27Z|00387|binding|INFO|Removing iface tap91360d67-ca ovn-installed in OVS
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.735 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.745 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:22:00 10.100.0.9'], port_security=['fa:16:3e:e5:22:00 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '86281329-601b-43f8-8481-663f35dbe261', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-338626c5-1f64-4e00-b560-256b83590866', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef5e74ef-33f5-4136-a29e-09c8fc96c8b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9baa215-6352-4e59-96e8-12bbc8110c9a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=91360d67-ca14-4940-989d-37d7efdfc1d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.746 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 91360d67-ca14-4940-989d-37d7efdfc1d7 in datapath 338626c5-1f64-4e00-b560-256b83590866 unbound from our chassis
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.747 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 338626c5-1f64-4e00-b560-256b83590866, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.747 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.748 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c9815df7-e46f-485e-9197-0f96e14793d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.748 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-338626c5-1f64-4e00-b560-256b83590866 namespace which is not needed anymore
Jan 23 09:28:27 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000062.scope: Deactivated successfully.
Jan 23 09:28:27 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000062.scope: Consumed 11.340s CPU time.
Jan 23 09:28:27 compute-0 systemd-machined[153562]: Machine qemu-50-instance-00000062 terminated.
Jan 23 09:28:27 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [NOTICE]   (222016) : haproxy version is 2.8.14-c23fe91
Jan 23 09:28:27 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [NOTICE]   (222016) : path to executable is /usr/sbin/haproxy
Jan 23 09:28:27 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [WARNING]  (222016) : Exiting Master process...
Jan 23 09:28:27 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [ALERT]    (222016) : Current worker (222018) exited with code 143 (Terminated)
Jan 23 09:28:27 compute-0 neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866[222012]: [WARNING]  (222016) : All workers exited. Exiting... (0)
Jan 23 09:28:27 compute-0 systemd[1]: libpod-790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b.scope: Deactivated successfully.
Jan 23 09:28:27 compute-0 podman[222336]: 2026-01-23 09:28:27.842471567 +0000 UTC m=+0.033947718 container died 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:28:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b-userdata-shm.mount: Deactivated successfully.
Jan 23 09:28:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb805da73109bdc45eb032bd84f75421c26c489b7f41104060628c5e59961384-merged.mount: Deactivated successfully.
Jan 23 09:28:27 compute-0 podman[222336]: 2026-01-23 09:28:27.860957668 +0000 UTC m=+0.052433799 container cleanup 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:28:27 compute-0 systemd[1]: libpod-conmon-790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b.scope: Deactivated successfully.
Jan 23 09:28:27 compute-0 podman[222362]: 2026-01-23 09:28:27.90213874 +0000 UTC m=+0.023761113 container remove 790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.906 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dceee6e2-0eb0-4a6f-bac7-577ba3d8aebf]: (4, ('Fri Jan 23 09:28:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866 (790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b)\n790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b\nFri Jan 23 09:28:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-338626c5-1f64-4e00-b560-256b83590866 (790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b)\n790a004b195005088fdb2605bcbdbaa2c6c601f305b27842a0c90e3872972c9b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.907 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e235df25-0674-46cf-92d7-c2bbed1b19c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.908 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap338626c5-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.910 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.924 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 kernel: tap338626c5-10: left promiscuous mode
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.930 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.932 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[90c0c638-c689-49f1-be48-2b332f535d83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.944 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fccebd1b-0b1d-40bf-8e1d-ae20dc69d251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.945 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8addc2e-3853-4d89-90a7-5953339043c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.950 182096 INFO nova.virt.libvirt.driver [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance destroyed successfully.
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.951 182096 DEBUG nova.objects.instance [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'resources' on Instance uuid 86281329-601b-43f8-8481-663f35dbe261 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.958 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9f86c4-927c-4ded-b026-e6239a35ae7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398748, 'reachable_time': 32617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222394, 'error': None, 'target': 'ovnmeta-338626c5-1f64-4e00-b560-256b83590866', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d338626c5\x2d1f64\x2d4e00\x2db560\x2d256b83590866.mount: Deactivated successfully.
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.960 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-338626c5-1f64-4e00-b560-256b83590866 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:28:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:27.960 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[68af9a37-768c-4b8d-a54f-da0ba7413c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.973 182096 DEBUG nova.virt.libvirt.vif [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:27:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-499831423',display_name='tempest-TestNetworkAdvancedServerOps-server-499831423',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-499831423',id=98,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAIsYFCa/11HZR3PTxodDOgYteIa6L1Fy2AcBxm/cET+zB2mloDk36zPnZjwfjHoVBZJTHqUMTQzNneLTFMGKPbO7IrpXzXA4rU3Vf9S33Vyg1DVCmvSF4cG9nHo/x7+Ug==',key_name='tempest-TestNetworkAdvancedServerOps-844257604',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:28:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-lg0l11g9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:28:24Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=86281329-601b-43f8-8481-663f35dbe261,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.973 182096 DEBUG nova.network.os_vif_util [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.974 182096 DEBUG nova.network.os_vif_util [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.974 182096 DEBUG os_vif [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.975 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.975 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91360d67-ca, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.976 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.978 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.979 182096 INFO os_vif [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:22:00,bridge_name='br-int',has_traffic_filtering=True,id=91360d67-ca14-4940-989d-37d7efdfc1d7,network=Network(338626c5-1f64-4e00-b560-256b83590866),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91360d67-ca')
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.980 182096 INFO nova.virt.libvirt.driver [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Deleting instance files /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261_del
Jan 23 09:28:27 compute-0 nova_compute[182092]: 2026-01-23 09:28:27.980 182096 INFO nova.virt.libvirt.driver [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Deletion of /var/lib/nova/instances/86281329-601b-43f8-8481-663f35dbe261_del complete
Jan 23 09:28:28 compute-0 nova_compute[182092]: 2026-01-23 09:28:28.060 182096 INFO nova.compute.manager [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 23 09:28:28 compute-0 nova_compute[182092]: 2026-01-23 09:28:28.060 182096 DEBUG oslo.service.loopingcall [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:28:28 compute-0 nova_compute[182092]: 2026-01-23 09:28:28.060 182096 DEBUG nova.compute.manager [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:28:28 compute-0 nova_compute[182092]: 2026-01-23 09:28:28.061 182096 DEBUG nova.network.neutron [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.229 182096 DEBUG nova.network.neutron [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.240 182096 INFO nova.compute.manager [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] Took 1.18 seconds to deallocate network for instance.
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.287 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.287 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.342 182096 DEBUG nova.compute.provider_tree [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.351 182096 DEBUG nova.scheduler.client.report [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.362 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.375 182096 DEBUG nova.network.neutron [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updated VIF entry in instance network info cache for port 91360d67-ca14-4940-989d-37d7efdfc1d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.375 182096 DEBUG nova.network.neutron [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Updating instance_info_cache with network_info: [{"id": "91360d67-ca14-4940-989d-37d7efdfc1d7", "address": "fa:16:3e:e5:22:00", "network": {"id": "338626c5-1f64-4e00-b560-256b83590866", "bridge": "br-int", "label": "tempest-network-smoke--2029135739", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91360d67-ca", "ovs_interfaceid": "91360d67-ca14-4940-989d-37d7efdfc1d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.385 182096 DEBUG oslo_concurrency.lockutils [req-87dc9993-4702-46b1-a876-e3511255f379 req-cea9a4d2-2689-41d7-acb3-16c549f759db 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-86281329-601b-43f8-8481-663f35dbe261" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.386 182096 INFO nova.scheduler.client.report [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Deleted allocations for instance 86281329-601b-43f8-8481-663f35dbe261
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.434 182096 DEBUG oslo_concurrency.lockutils [None req-4f1ab8f7-7bd7-4fb1-851e-e1c80b5cc6a6 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.686 182096 DEBUG nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-vif-unplugged-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.686 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.686 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.686 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] No waiting events found dispatching network-vif-unplugged-91360d67-ca14-4940-989d-37d7efdfc1d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 WARNING nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received unexpected event network-vif-unplugged-91360d67-ca14-4940-989d-37d7efdfc1d7 for instance with vm_state deleted and task_state None.
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "86281329-601b-43f8-8481-663f35dbe261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG oslo_concurrency.lockutils [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "86281329-601b-43f8-8481-663f35dbe261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.687 182096 DEBUG nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] No waiting events found dispatching network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:29 compute-0 nova_compute[182092]: 2026-01-23 09:28:29.688 182096 WARNING nova.compute.manager [req-db85a9aa-b43e-46ab-98e2-9e98722b515f req-9e33ca31-0e30-466a-939d-a473faedd7a0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received unexpected event network-vif-plugged-91360d67-ca14-4940-989d-37d7efdfc1d7 for instance with vm_state deleted and task_state None.
Jan 23 09:28:30 compute-0 nova_compute[182092]: 2026-01-23 09:28:30.503 182096 DEBUG nova.compute.manager [req-28a983d0-330d-4201-85e4-dad8a80b78d7 req-6e2f8d0a-f8c8-4a30-b4aa-928f80801401 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Received event network-vif-deleted-91360d67-ca14-4940-989d-37d7efdfc1d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:30 compute-0 nova_compute[182092]: 2026-01-23 09:28:30.503 182096 INFO nova.compute.manager [req-28a983d0-330d-4201-85e4-dad8a80b78d7 req-6e2f8d0a-f8c8-4a30-b4aa-928f80801401 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Neutron deleted interface 91360d67-ca14-4940-989d-37d7efdfc1d7; detaching it from the instance and deleting it from the info cache
Jan 23 09:28:30 compute-0 nova_compute[182092]: 2026-01-23 09:28:30.503 182096 DEBUG nova.network.neutron [req-28a983d0-330d-4201-85e4-dad8a80b78d7 req-6e2f8d0a-f8c8-4a30-b4aa-928f80801401 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 23 09:28:30 compute-0 nova_compute[182092]: 2026-01-23 09:28:30.505 182096 DEBUG nova.compute.manager [req-28a983d0-330d-4201-85e4-dad8a80b78d7 req-6e2f8d0a-f8c8-4a30-b4aa-928f80801401 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 86281329-601b-43f8-8481-663f35dbe261] Detach interface failed, port_id=91360d67-ca14-4940-989d-37d7efdfc1d7, reason: Instance 86281329-601b-43f8-8481-663f35dbe261 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.463 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.463 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.474 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.528 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.528 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.533 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.534 182096 INFO nova.compute.claims [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.610 182096 DEBUG nova.compute.provider_tree [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.621 182096 DEBUG nova.scheduler.client.report [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.642 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.642 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.676 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.676 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.687 182096 INFO nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.697 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.762 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.764 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.765 182096 INFO nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Creating image(s)
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.765 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.766 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.766 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.776 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.824 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.824 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.825 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.834 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.880 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.881 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.903 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.903 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.904 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.950 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.951 182096 DEBUG nova.virt.disk.api [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Checking if we can resize image /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.951 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.997 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.998 182096 DEBUG nova.virt.disk.api [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Cannot resize image /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:28:31 compute-0 nova_compute[182092]: 2026-01-23 09:28:31.998 182096 DEBUG nova.objects.instance [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lazy-loading 'migration_context' on Instance uuid a6e1cbfb-7cfe-45af-96d9-f198d0342d79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.022 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.024 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Ensure instance console log exists: /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.024 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.025 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.025 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.121 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.203 182096 DEBUG nova.policy [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5efca8c207f84ad492cc9ad81a1a25db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0851b7ba138142a5bbebfa067cfcfb09', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:28:32 compute-0 nova_compute[182092]: 2026-01-23 09:28:32.977 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:32 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:28:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:28:33 compute-0 nova_compute[182092]: 2026-01-23 09:28:33.144 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Successfully created port: 16d1a1ff-9f40-4471-b299-74debf9599df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:28:33 compute-0 nova_compute[182092]: 2026-01-23 09:28:33.578 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:33 compute-0 nova_compute[182092]: 2026-01-23 09:28:33.783 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.093 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.112 182096 WARNING nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.112 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Triggering sync for uuid a6e1cbfb-7cfe-45af-96d9-f198d0342d79 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.112 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.119 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Successfully updated port: 16d1a1ff-9f40-4471-b299-74debf9599df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.130 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.130 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquired lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.130 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.192 182096 DEBUG nova.compute.manager [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-changed-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.193 182096 DEBUG nova.compute.manager [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Refreshing instance network info cache due to event network-changed-16d1a1ff-9f40-4471-b299-74debf9599df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.193 182096 DEBUG oslo_concurrency.lockutils [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.238 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.849 182096 DEBUG nova.network.neutron [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Updating instance_info_cache with network_info: [{"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.868 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Releasing lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.868 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Instance network_info: |[{"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.868 182096 DEBUG oslo_concurrency.lockutils [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.868 182096 DEBUG nova.network.neutron [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Refreshing network info cache for port 16d1a1ff-9f40-4471-b299-74debf9599df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.871 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Start _get_guest_xml network_info=[{"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.875 182096 WARNING nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.881 182096 DEBUG nova.virt.libvirt.host [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.881 182096 DEBUG nova.virt.libvirt.host [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.884 182096 DEBUG nova.virt.libvirt.host [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.884 182096 DEBUG nova.virt.libvirt.host [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.885 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.886 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.886 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.886 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.886 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.886 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.887 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.887 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.887 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.887 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.887 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.888 182096 DEBUG nova.virt.hardware [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.890 182096 DEBUG nova.virt.libvirt.vif [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1726417773',display_name='tempest-ServerPasswordTestJSON-server-1726417773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1726417773',id=102,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0851b7ba138142a5bbebfa067cfcfb09',ramdisk_id='',reservation_id='r-bvg1yyha',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-611377974',owner_user_name='tempest-ServerPasswordTest
JSON-611377974-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:31Z,user_data=None,user_id='5efca8c207f84ad492cc9ad81a1a25db',uuid=a6e1cbfb-7cfe-45af-96d9-f198d0342d79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.890 182096 DEBUG nova.network.os_vif_util [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converting VIF {"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.891 182096 DEBUG nova.network.os_vif_util [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.892 182096 DEBUG nova.objects.instance [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lazy-loading 'pci_devices' on Instance uuid a6e1cbfb-7cfe-45af-96d9-f198d0342d79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.900 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <uuid>a6e1cbfb-7cfe-45af-96d9-f198d0342d79</uuid>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <name>instance-00000066</name>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerPasswordTestJSON-server-1726417773</nova:name>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:28:34</nova:creationTime>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:user uuid="5efca8c207f84ad492cc9ad81a1a25db">tempest-ServerPasswordTestJSON-611377974-project-member</nova:user>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:project uuid="0851b7ba138142a5bbebfa067cfcfb09">tempest-ServerPasswordTestJSON-611377974</nova:project>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         <nova:port uuid="16d1a1ff-9f40-4471-b299-74debf9599df">
Jan 23 09:28:34 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <system>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="serial">a6e1cbfb-7cfe-45af-96d9-f198d0342d79</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="uuid">a6e1cbfb-7cfe-45af-96d9-f198d0342d79</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </system>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <os>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </os>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <features>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </features>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.config"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:7d:94:8c"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <target dev="tap16d1a1ff-9f"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/console.log" append="off"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <video>
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </video>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:28:34 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:28:34 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:28:34 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:28:34 compute-0 nova_compute[182092]: </domain>
Jan 23 09:28:34 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.901 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Preparing to wait for external event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.901 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.901 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.901 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.902 182096 DEBUG nova.virt.libvirt.vif [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1726417773',display_name='tempest-ServerPasswordTestJSON-server-1726417773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1726417773',id=102,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0851b7ba138142a5bbebfa067cfcfb09',ramdisk_id='',reservation_id='r-bvg1yyha',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-611377974',owner_user_name='tempest-ServerPa
sswordTestJSON-611377974-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:31Z,user_data=None,user_id='5efca8c207f84ad492cc9ad81a1a25db',uuid=a6e1cbfb-7cfe-45af-96d9-f198d0342d79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.902 182096 DEBUG nova.network.os_vif_util [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converting VIF {"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.903 182096 DEBUG nova.network.os_vif_util [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.903 182096 DEBUG os_vif [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.903 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.903 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.904 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.905 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.906 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d1a1ff-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.906 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap16d1a1ff-9f, col_values=(('external_ids', {'iface-id': '16d1a1ff-9f40-4471-b299-74debf9599df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:94:8c', 'vm-uuid': 'a6e1cbfb-7cfe-45af-96d9-f198d0342d79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.907 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:34 compute-0 NetworkManager[54920]: <info>  [1769160514.9084] manager: (tap16d1a1ff-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.909 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.912 182096 INFO os_vif [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f')
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.955 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.956 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.956 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] No VIF found with MAC fa:16:3e:7d:94:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:28:34 compute-0 nova_compute[182092]: 2026-01-23 09:28:34.956 182096 INFO nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Using config drive
Jan 23 09:28:34 compute-0 podman[222414]: 2026-01-23 09:28:34.978181097 +0000 UTC m=+0.041918964 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:28:35 compute-0 podman[222415]: 2026-01-23 09:28:35.003191536 +0000 UTC m=+0.067024172 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.256 182096 INFO nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Creating config drive at /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.config
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.261 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux64c7ig execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.377 182096 DEBUG oslo_concurrency.processutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux64c7ig" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:35 compute-0 kernel: tap16d1a1ff-9f: entered promiscuous mode
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.4164] manager: (tap16d1a1ff-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.417 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 ovn_controller[94697]: 2026-01-23T09:28:35Z|00388|binding|INFO|Claiming lport 16d1a1ff-9f40-4471-b299-74debf9599df for this chassis.
Jan 23 09:28:35 compute-0 ovn_controller[94697]: 2026-01-23T09:28:35Z|00389|binding|INFO|16d1a1ff-9f40-4471-b299-74debf9599df: Claiming fa:16:3e:7d:94:8c 10.100.0.11
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.424 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:94:8c 10.100.0.11'], port_security=['fa:16:3e:7d:94:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a6e1cbfb-7cfe-45af-96d9-f198d0342d79', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0851b7ba138142a5bbebfa067cfcfb09', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c753e4f-6b96-48fd-a6d0-3206708c6169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ae376f2-0af3-4b42-912b-45292fdc0d3b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=16d1a1ff-9f40-4471-b299-74debf9599df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.425 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 16d1a1ff-9f40-4471-b299-74debf9599df in datapath 8795a3d5-1a82-4f5f-933d-445cddea9b8f bound to our chassis
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.426 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8795a3d5-1a82-4f5f-933d-445cddea9b8f
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.434 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d24a3632-b31b-4ef1-bc06-8cdacd7dd242]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.434 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8795a3d5-11 in ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.436 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8795a3d5-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.436 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7e227d-dc14-47ef-b13a-7d1434d1ebe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.436 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d80e93f4-e372-435e-b40e-51bb86278d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.444 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[da4fe9dc-9218-4c82-94a5-f3eee382a019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 systemd-udevd[222470]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.4575] device (tap16d1a1ff-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.4583] device (tap16d1a1ff-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:28:35 compute-0 systemd-machined[153562]: New machine qemu-52-instance-00000066.
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.463 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4fd75f-2b8b-49b1-a349-dcb36e4bb6d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.475 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-00000066.
Jan 23 09:28:35 compute-0 ovn_controller[94697]: 2026-01-23T09:28:35Z|00390|binding|INFO|Setting lport 16d1a1ff-9f40-4471-b299-74debf9599df ovn-installed in OVS
Jan 23 09:28:35 compute-0 ovn_controller[94697]: 2026-01-23T09:28:35Z|00391|binding|INFO|Setting lport 16d1a1ff-9f40-4471-b299-74debf9599df up in Southbound
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.481 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.489 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6e966d-7ccb-417d-95c1-e08a2cb691ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.4932] manager: (tap8795a3d5-10): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.492 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[136f8186-005a-44e5-ae59-85af3a824006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.518 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6d096d72-abfe-4250-a351-51b05d16fdb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.520 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[95b6b015-4f6c-46da-9482-f68ff51698a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.5342] device (tap8795a3d5-10): carrier: link connected
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.537 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b87b04e6-fefb-4d78-8470-f2910e946de4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.548 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d253aa-d8d7-4e06-a056-9ff585b8c9b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8795a3d5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:fa:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402102, 'reachable_time': 36190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222494, 'error': None, 'target': 'ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.558 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aee1347c-ba13-4090-90e6-488b5ecb93e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:fae6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402102, 'tstamp': 402102}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222495, 'error': None, 'target': 'ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.567 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9d301a48-3903-4340-aaba-531a1be18212]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8795a3d5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:fa:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402102, 'reachable_time': 36190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222496, 'error': None, 'target': 'ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.584 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b15f58b7-30e3-4e35-99f9-a57e3602a7e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.616 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[18aa91e5-9c18-49f0-b442-1de1ebdef10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.617 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8795a3d5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.617 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.617 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8795a3d5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:35 compute-0 kernel: tap8795a3d5-10: entered promiscuous mode
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.619 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 NetworkManager[54920]: <info>  [1769160515.6206] manager: (tap8795a3d5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.621 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8795a3d5-10, col_values=(('external_ids', {'iface-id': '27a8d035-6fa1-4239-91ce-e0988f5f8ab1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.621 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 ovn_controller[94697]: 2026-01-23T09:28:35Z|00392|binding|INFO|Releasing lport 27a8d035-6fa1-4239-91ce-e0988f5f8ab1 from this chassis (sb_readonly=0)
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.634 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.635 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8795a3d5-1a82-4f5f-933d-445cddea9b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8795a3d5-1a82-4f5f-933d-445cddea9b8f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.636 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2997f9-2e9b-4d1a-b701-5a6fbf214acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.637 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-8795a3d5-1a82-4f5f-933d-445cddea9b8f
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/8795a3d5-1a82-4f5f-933d-445cddea9b8f.pid.haproxy
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 8795a3d5-1a82-4f5f-933d-445cddea9b8f
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:28:35 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:35.639 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'env', 'PROCESS_TAG=haproxy-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8795a3d5-1a82-4f5f-933d-445cddea9b8f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.639 182096 DEBUG nova.compute.manager [req-c7a01d11-8d98-406a-9815-8680fe8eefb1 req-19a44688-757f-419c-9796-7709a2c03a92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.640 182096 DEBUG oslo_concurrency.lockutils [req-c7a01d11-8d98-406a-9815-8680fe8eefb1 req-19a44688-757f-419c-9796-7709a2c03a92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.640 182096 DEBUG oslo_concurrency.lockutils [req-c7a01d11-8d98-406a-9815-8680fe8eefb1 req-19a44688-757f-419c-9796-7709a2c03a92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.640 182096 DEBUG oslo_concurrency.lockutils [req-c7a01d11-8d98-406a-9815-8680fe8eefb1 req-19a44688-757f-419c-9796-7709a2c03a92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:35 compute-0 nova_compute[182092]: 2026-01-23 09:28:35.641 182096 DEBUG nova.compute.manager [req-c7a01d11-8d98-406a-9815-8680fe8eefb1 req-19a44688-757f-419c-9796-7709a2c03a92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Processing event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:28:35 compute-0 podman[222525]: 2026-01-23 09:28:35.913441619 +0000 UTC m=+0.033743372 container create ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:28:35 compute-0 systemd[1]: Started libpod-conmon-ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396.scope.
Jan 23 09:28:35 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:28:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77b54874eef3ff24eb6465cc33803cbdf4428118262b8236eeff3d22b76aa623/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:28:35 compute-0 podman[222525]: 2026-01-23 09:28:35.985170564 +0000 UTC m=+0.105472317 container init ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 09:28:35 compute-0 podman[222525]: 2026-01-23 09:28:35.989534746 +0000 UTC m=+0.109836499 container start ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:28:35 compute-0 podman[222525]: 2026-01-23 09:28:35.900674827 +0000 UTC m=+0.020976601 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:28:36 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [NOTICE]   (222548) : New worker (222550) forked
Jan 23 09:28:36 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [NOTICE]   (222548) : Loading success.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.036 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160516.0356066, a6e1cbfb-7cfe-45af-96d9-f198d0342d79 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.036 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] VM Started (Lifecycle Event)
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.038 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.052 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.054 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.057 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.059 182096 INFO nova.virt.libvirt.driver [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Instance spawned successfully.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.059 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.073 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.073 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160516.0357184, a6e1cbfb-7cfe-45af-96d9-f198d0342d79 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.074 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] VM Paused (Lifecycle Event)
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.076 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.077 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.077 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.078 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.078 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.078 182096 DEBUG nova.virt.libvirt.driver [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.096 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.097 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160516.040117, a6e1cbfb-7cfe-45af-96d9-f198d0342d79 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.098 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] VM Resumed (Lifecycle Event)
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.115 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.117 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.134 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.135 182096 INFO nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Took 4.37 seconds to spawn the instance on the hypervisor.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.135 182096 DEBUG nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.201 182096 INFO nova.compute.manager [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Took 4.69 seconds to build instance.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.215 182096 DEBUG oslo_concurrency.lockutils [None req-27780558-3d84-481b-a0f8-64491ab93574 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.216 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.216 182096 INFO nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.216 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.217 182096 DEBUG nova.network.neutron [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Updated VIF entry in instance network info cache for port 16d1a1ff-9f40-4471-b299-74debf9599df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.218 182096 DEBUG nova.network.neutron [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Updating instance_info_cache with network_info: [{"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.227 182096 DEBUG oslo_concurrency.lockutils [req-7771d573-eee6-49ab-be2f-6501e10213f6 req-d445296f-84e3-4a6a-b542-8cf9c082b908 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a6e1cbfb-7cfe-45af-96d9-f198d0342d79" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.892 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.893 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.893 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.894 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.894 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.900 182096 INFO nova.compute.manager [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Terminating instance
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.906 182096 DEBUG nova.compute.manager [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:28:36 compute-0 kernel: tap16d1a1ff-9f (unregistering): left promiscuous mode
Jan 23 09:28:36 compute-0 NetworkManager[54920]: <info>  [1769160516.9280] device (tap16d1a1ff-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:28:36 compute-0 ovn_controller[94697]: 2026-01-23T09:28:36Z|00393|binding|INFO|Releasing lport 16d1a1ff-9f40-4471-b299-74debf9599df from this chassis (sb_readonly=0)
Jan 23 09:28:36 compute-0 ovn_controller[94697]: 2026-01-23T09:28:36Z|00394|binding|INFO|Setting lport 16d1a1ff-9f40-4471-b299-74debf9599df down in Southbound
Jan 23 09:28:36 compute-0 ovn_controller[94697]: 2026-01-23T09:28:36Z|00395|binding|INFO|Removing iface tap16d1a1ff-9f ovn-installed in OVS
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.934 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.936 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:36.941 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:94:8c 10.100.0.11'], port_security=['fa:16:3e:7d:94:8c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a6e1cbfb-7cfe-45af-96d9-f198d0342d79', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0851b7ba138142a5bbebfa067cfcfb09', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c753e4f-6b96-48fd-a6d0-3206708c6169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ae376f2-0af3-4b42-912b-45292fdc0d3b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=16d1a1ff-9f40-4471-b299-74debf9599df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:36.942 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 16d1a1ff-9f40-4471-b299-74debf9599df in datapath 8795a3d5-1a82-4f5f-933d-445cddea9b8f unbound from our chassis
Jan 23 09:28:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:36.943 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8795a3d5-1a82-4f5f-933d-445cddea9b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:36.944 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7858eeff-600a-480d-9b2e-f2423ca414b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:36.944 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f namespace which is not needed anymore
Jan 23 09:28:36 compute-0 nova_compute[182092]: 2026-01-23 09:28:36.950 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:36 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000066.scope: Deactivated successfully.
Jan 23 09:28:36 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000066.scope: Consumed 1.400s CPU time.
Jan 23 09:28:36 compute-0 systemd-machined[153562]: Machine qemu-52-instance-00000066 terminated.
Jan 23 09:28:37 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [NOTICE]   (222548) : haproxy version is 2.8.14-c23fe91
Jan 23 09:28:37 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [NOTICE]   (222548) : path to executable is /usr/sbin/haproxy
Jan 23 09:28:37 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [ALERT]    (222548) : Current worker (222550) exited with code 143 (Terminated)
Jan 23 09:28:37 compute-0 neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f[222538]: [WARNING]  (222548) : All workers exited. Exiting... (0)
Jan 23 09:28:37 compute-0 systemd[1]: libpod-ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396.scope: Deactivated successfully.
Jan 23 09:28:37 compute-0 conmon[222538]: conmon ca146003dc72d74072da <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396.scope/container/memory.events
Jan 23 09:28:37 compute-0 podman[222574]: 2026-01-23 09:28:37.047008604 +0000 UTC m=+0.039761674 container died ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:28:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396-userdata-shm.mount: Deactivated successfully.
Jan 23 09:28:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-77b54874eef3ff24eb6465cc33803cbdf4428118262b8236eeff3d22b76aa623-merged.mount: Deactivated successfully.
Jan 23 09:28:37 compute-0 podman[222574]: 2026-01-23 09:28:37.065337959 +0000 UTC m=+0.058091030 container cleanup ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 09:28:37 compute-0 systemd[1]: libpod-conmon-ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396.scope: Deactivated successfully.
Jan 23 09:28:37 compute-0 podman[222598]: 2026-01-23 09:28:37.106435473 +0000 UTC m=+0.025628655 container remove ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.111 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a39ff3-d3ab-4d7b-a096-7217d257b1aa]: (4, ('Fri Jan 23 09:28:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f (ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396)\nca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396\nFri Jan 23 09:28:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f (ca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396)\nca146003dc72d74072dab29c78e18ceccfd4b172c0a9ac62221a2ee69192e396\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.112 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[67f5203c-b41a-46f4-b8d3-b36f24789187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.114 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8795a3d5-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:37 compute-0 NetworkManager[54920]: <info>  [1769160517.1182] manager: (tap16d1a1ff-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.118 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:37 compute-0 kernel: tap8795a3d5-10: left promiscuous mode
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.134 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.137 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[15444612-3a70-4966-b811-138883df42bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.147 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0025defd-912f-4ee0-8f3e-353027bcfe72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.148 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa64d254-9597-4afd-8c70-9b7046e4bcfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.159 182096 INFO nova.virt.libvirt.driver [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Instance destroyed successfully.
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.160 182096 DEBUG nova.objects.instance [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lazy-loading 'resources' on Instance uuid a6e1cbfb-7cfe-45af-96d9-f198d0342d79 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.162 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6b3f62-fca4-4a1b-9d4a-d26fee938d68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402097, 'reachable_time': 40997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222627, 'error': None, 'target': 'ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.163 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8795a3d5-1a82-4f5f-933d-445cddea9b8f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:28:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:37.164 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[9096c0f4-91dd-4384-8601-6a9e2d14fc70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d8795a3d5\x2d1a82\x2d4f5f\x2d933d\x2d445cddea9b8f.mount: Deactivated successfully.
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.171 182096 DEBUG nova.virt.libvirt.vif [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:28:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1726417773',display_name='tempest-ServerPasswordTestJSON-server-1726417773',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1726417773',id=102,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:28:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0851b7ba138142a5bbebfa067cfcfb09',ramdisk_id='',reservation_id='r-bvg1yyha',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-611377974',owner_user_name='tempest-ServerPasswordTestJSON-611377974-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:28:36Z,user_data=None,user_id='5efca8c207f84ad492cc9ad81a1a25db',uuid=a6e1cbfb-7cfe-45af-96d9-f198d0342d79,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.172 182096 DEBUG nova.network.os_vif_util [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converting VIF {"id": "16d1a1ff-9f40-4471-b299-74debf9599df", "address": "fa:16:3e:7d:94:8c", "network": {"id": "8795a3d5-1a82-4f5f-933d-445cddea9b8f", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-687224685-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0851b7ba138142a5bbebfa067cfcfb09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d1a1ff-9f", "ovs_interfaceid": "16d1a1ff-9f40-4471-b299-74debf9599df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.173 182096 DEBUG nova.network.os_vif_util [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.173 182096 DEBUG os_vif [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.175 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.175 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d1a1ff-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.177 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.179 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.181 182096 INFO os_vif [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:8c,bridge_name='br-int',has_traffic_filtering=True,id=16d1a1ff-9f40-4471-b299-74debf9599df,network=Network(8795a3d5-1a82-4f5f-933d-445cddea9b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d1a1ff-9f')
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.182 182096 INFO nova.virt.libvirt.driver [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Deleting instance files /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79_del
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.182 182096 INFO nova.virt.libvirt.driver [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Deletion of /var/lib/nova/instances/a6e1cbfb-7cfe-45af-96d9-f198d0342d79_del complete
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.238 182096 INFO nova.compute.manager [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.238 182096 DEBUG oslo.service.loopingcall [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.239 182096 DEBUG nova.compute.manager [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.239 182096 DEBUG nova.network.neutron [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.727 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.728 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.728 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.728 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.729 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] No waiting events found dispatching network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.729 182096 WARNING nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received unexpected event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df for instance with vm_state active and task_state deleting.
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.729 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-unplugged-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.729 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.730 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.730 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.730 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] No waiting events found dispatching network-vif-unplugged-16d1a1ff-9f40-4471-b299-74debf9599df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.731 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-unplugged-16d1a1ff-9f40-4471-b299-74debf9599df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.731 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.731 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.731 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.732 182096 DEBUG oslo_concurrency.lockutils [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.732 182096 DEBUG nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] No waiting events found dispatching network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.732 182096 WARNING nova.compute.manager [req-fbd6d1fd-5ed1-4365-af1f-6c647ad43b65 req-10ca598d-6ec4-445b-856d-3cb90fa4382b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received unexpected event network-vif-plugged-16d1a1ff-9f40-4471-b299-74debf9599df for instance with vm_state active and task_state deleting.
Jan 23 09:28:37 compute-0 nova_compute[182092]: 2026-01-23 09:28:37.996 182096 DEBUG nova.network.neutron [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.014 182096 INFO nova.compute.manager [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Took 0.78 seconds to deallocate network for instance.
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.066 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.067 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.069 182096 DEBUG nova.compute.manager [req-07aeeccb-8dbf-4611-a5a1-5fe6bd5601fb req-e0b64e7d-22f7-40a6-abb3-5c2c0daf73b1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Received event network-vif-deleted-16d1a1ff-9f40-4471-b299-74debf9599df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.143 182096 DEBUG nova.compute.provider_tree [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.168 182096 DEBUG nova.scheduler.client.report [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.182 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.214 182096 INFO nova.scheduler.client.report [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Deleted allocations for instance a6e1cbfb-7cfe-45af-96d9-f198d0342d79
Jan 23 09:28:38 compute-0 nova_compute[182092]: 2026-01-23 09:28:38.262 182096 DEBUG oslo_concurrency.lockutils [None req-4c6fdafd-68a6-4fa0-bfe9-275e08cd5c0a 5efca8c207f84ad492cc9ad81a1a25db 0851b7ba138142a5bbebfa067cfcfb09 - - default default] Lock "a6e1cbfb-7cfe-45af-96d9-f198d0342d79" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:39.862 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:39.863 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:39.863 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:41 compute-0 nova_compute[182092]: 2026-01-23 09:28:41.327 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:42 compute-0 nova_compute[182092]: 2026-01-23 09:28:42.140 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:42 compute-0 nova_compute[182092]: 2026-01-23 09:28:42.177 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:42 compute-0 nova_compute[182092]: 2026-01-23 09:28:42.950 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160507.9492843, 86281329-601b-43f8-8481-663f35dbe261 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:42 compute-0 nova_compute[182092]: 2026-01-23 09:28:42.951 182096 INFO nova.compute.manager [-] [instance: 86281329-601b-43f8-8481-663f35dbe261] VM Stopped (Lifecycle Event)
Jan 23 09:28:42 compute-0 nova_compute[182092]: 2026-01-23 09:28:42.974 182096 DEBUG nova.compute.manager [None req-40595ae2-d1fb-40ef-a663-a2c501a2d1a8 - - - - - -] [instance: 86281329-601b-43f8-8481-663f35dbe261] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:46 compute-0 podman[222630]: 2026-01-23 09:28:46.207204277 +0000 UTC m=+0.042937345 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:28:46 compute-0 podman[222629]: 2026-01-23 09:28:46.210353698 +0000 UTC m=+0.046809128 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.626 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.626 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.642 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.719 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.719 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.723 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.724 182096 INFO nova.compute.claims [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.846 182096 DEBUG nova.compute.provider_tree [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.857 182096 DEBUG nova.scheduler.client.report [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.873 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.873 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.924 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.924 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.942 182096 INFO nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:28:46 compute-0 nova_compute[182092]: 2026-01-23 09:28:46.955 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.035 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.036 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.036 182096 INFO nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Creating image(s)
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.037 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.037 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.038 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.048 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.093 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.094 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.095 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.104 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.142 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.148 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.148 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.169 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.169 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.170 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.180 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.214 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.215 182096 DEBUG nova.virt.disk.api [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Checking if we can resize image /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.215 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.259 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.260 182096 DEBUG nova.virt.disk.api [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Cannot resize image /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.261 182096 DEBUG nova.objects.instance [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'migration_context' on Instance uuid 1c7830f4-294f-413c-843b-113d480d4ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.269 182096 DEBUG nova.policy [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff07a58fbdbb442cb7faa2ceb6066b86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '439a96c54de44b239d374954108968be', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.274 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.274 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Ensure instance console log exists: /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.274 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.275 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:47 compute-0 nova_compute[182092]: 2026-01-23 09:28:47.275 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:48 compute-0 podman[222681]: 2026-01-23 09:28:48.204438366 +0000 UTC m=+0.041953981 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:28:48 compute-0 nova_compute[182092]: 2026-01-23 09:28:48.269 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Successfully created port: 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.298 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Successfully updated port: 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.309 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.309 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquired lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.309 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.374 182096 DEBUG nova.compute.manager [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-changed-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.374 182096 DEBUG nova.compute.manager [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Refreshing instance network info cache due to event network-changed-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.374 182096 DEBUG oslo_concurrency.lockutils [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:28:49 compute-0 nova_compute[182092]: 2026-01-23 09:28:49.439 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.083 182096 DEBUG nova.network.neutron [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Updating instance_info_cache with network_info: [{"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.099 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Releasing lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.099 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Instance network_info: |[{"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.099 182096 DEBUG oslo_concurrency.lockutils [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.100 182096 DEBUG nova.network.neutron [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Refreshing network info cache for port 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.101 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Start _get_guest_xml network_info=[{"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.105 182096 WARNING nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.109 182096 DEBUG nova.virt.libvirt.host [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.110 182096 DEBUG nova.virt.libvirt.host [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.112 182096 DEBUG nova.virt.libvirt.host [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.112 182096 DEBUG nova.virt.libvirt.host [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.113 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.113 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.113 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.113 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.114 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.114 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.114 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.114 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.114 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.115 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.115 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.115 182096 DEBUG nova.virt.hardware [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.117 182096 DEBUG nova.virt.libvirt.vif [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1045320169',display_name='tempest-DeleteServersTestJSON-server-1045320169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1045320169',id=104,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-tmx90p5w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:46Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=1c7830f4-294f-413c-843b-113d480d4ed9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.118 182096 DEBUG nova.network.os_vif_util [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.118 182096 DEBUG nova.network.os_vif_util [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.119 182096 DEBUG nova.objects.instance [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c7830f4-294f-413c-843b-113d480d4ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.132 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <uuid>1c7830f4-294f-413c-843b-113d480d4ed9</uuid>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <name>instance-00000068</name>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:name>tempest-DeleteServersTestJSON-server-1045320169</nova:name>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:28:50</nova:creationTime>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:user uuid="ff07a58fbdbb442cb7faa2ceb6066b86">tempest-DeleteServersTestJSON-2005458721-project-member</nova:user>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:project uuid="439a96c54de44b239d374954108968be">tempest-DeleteServersTestJSON-2005458721</nova:project>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         <nova:port uuid="4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99">
Jan 23 09:28:50 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <system>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="serial">1c7830f4-294f-413c-843b-113d480d4ed9</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="uuid">1c7830f4-294f-413c-843b-113d480d4ed9</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </system>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <os>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </os>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <features>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </features>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.config"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:d5:bd:1f"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <target dev="tap4d2cf5ac-5e"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/console.log" append="off"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <video>
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </video>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:28:50 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:28:50 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:28:50 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:28:50 compute-0 nova_compute[182092]: </domain>
Jan 23 09:28:50 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.133 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Preparing to wait for external event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.133 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.133 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.133 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.134 182096 DEBUG nova.virt.libvirt.vif [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1045320169',display_name='tempest-DeleteServersTestJSON-server-1045320169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1045320169',id=104,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-tmx90p5w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:46Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=1c7830f4-294f-413c-843b-113d480d4ed9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.134 182096 DEBUG nova.network.os_vif_util [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.135 182096 DEBUG nova.network.os_vif_util [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.135 182096 DEBUG os_vif [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.135 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.136 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.136 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.138 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.138 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d2cf5ac-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.139 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d2cf5ac-5e, col_values=(('external_ids', {'iface-id': '4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:bd:1f', 'vm-uuid': '1c7830f4-294f-413c-843b-113d480d4ed9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.1408] manager: (tap4d2cf5ac-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.141 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.143 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.144 182096 INFO os_vif [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e')
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.185 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.186 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.186 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No VIF found with MAC fa:16:3e:d5:bd:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.186 182096 INFO nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Using config drive
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.433 182096 INFO nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Creating config drive at /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.config
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.437 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyf2xkazs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.556 182096 DEBUG oslo_concurrency.processutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyf2xkazs" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:28:50 compute-0 kernel: tap4d2cf5ac-5e: entered promiscuous mode
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.595 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 ovn_controller[94697]: 2026-01-23T09:28:50Z|00396|binding|INFO|Claiming lport 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 for this chassis.
Jan 23 09:28:50 compute-0 ovn_controller[94697]: 2026-01-23T09:28:50Z|00397|binding|INFO|4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99: Claiming fa:16:3e:d5:bd:1f 10.100.0.13
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.5991] manager: (tap4d2cf5ac-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.608 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:bd:1f 10.100.0.13'], port_security=['fa:16:3e:d5:bd:1f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1c7830f4-294f-413c-843b-113d480d4ed9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '439a96c54de44b239d374954108968be', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd676b957-c18c-4bbf-b899-13521827f709', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=38af584d-6b9d-4c98-af0a-2dc34a75edac, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.609 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 in datapath 0ebeaff5-3e26-47fb-9cca-bb690223e5fb bound to our chassis
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.610 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:28:50 compute-0 systemd-udevd[222718]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4552170e-8162-49fb-8caf-a7de1176af8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.620 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0ebeaff5-31 in ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.622 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0ebeaff5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b76935f3-ea82-4c31-919b-756adaa3e480]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[554a13bf-d70a-4bf9-a63f-713ce5f218ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.630 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[0deab141-84e6-4721-9c69-9ec775c38ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.6368] device (tap4d2cf5ac-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.6375] device (tap4d2cf5ac-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:28:50 compute-0 systemd-machined[153562]: New machine qemu-53-instance-00000068.
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.660 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[32bbf537-62c6-4618-b324-dfef166ce7cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000068.
Jan 23 09:28:50 compute-0 ovn_controller[94697]: 2026-01-23T09:28:50Z|00398|binding|INFO|Setting lport 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 ovn-installed in OVS
Jan 23 09:28:50 compute-0 ovn_controller[94697]: 2026-01-23T09:28:50Z|00399|binding|INFO|Setting lport 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 up in Southbound
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.681 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8d2f72-478e-48ee-9a52-90f44a4a8626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.685 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2fada766-0585-4e45-9017-2f53f2ce86fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.6856] manager: (tap0ebeaff5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/204)
Jan 23 09:28:50 compute-0 systemd-udevd[222724]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.715 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd50b39-5540-46ff-a708-fdaba09d7aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.717 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d8218ecf-4ed6-4b95-bf2f-02bc3d0bb243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.7323] device (tap0ebeaff5-30): carrier: link connected
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.735 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f194be4d-cddd-4e97-b4a3-94079927ab42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.747 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0e7f37-5507-415e-9883-f158fc71b0ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ebeaff5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:ee:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403621, 'reachable_time': 41760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222744, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.758 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4d6e22-681d-4421-a7e1-c53275a958c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:eeba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403621, 'tstamp': 403621}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222745, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.769 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5ba6c5-22e1-4439-853e-685db7d8ee54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ebeaff5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:ee:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403621, 'reachable_time': 41760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222746, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.789 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e353168b-15f8-423c-9426-604e5d9e77b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.831 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12eac4b8-0a60-4582-806f-0ffa7de80b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.831 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ebeaff5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.832 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.832 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ebeaff5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 kernel: tap0ebeaff5-30: entered promiscuous mode
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.833 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 NetworkManager[54920]: <info>  [1769160530.8342] manager: (tap0ebeaff5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.835 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.838 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0ebeaff5-30, col_values=(('external_ids', {'iface-id': '08ec51c7-019e-438c-bd01-f7c758e0c6d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.838 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.839 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 ovn_controller[94697]: 2026-01-23T09:28:50Z|00400|binding|INFO|Releasing lport 08ec51c7-019e-438c-bd01-f7c758e0c6d5 from this chassis (sb_readonly=0)
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.841 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.850 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4adfaa-6405-41e8-9510-cbdc9e73ebe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.851 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:50.853 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'env', 'PROCESS_TAG=haproxy-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.908 182096 DEBUG nova.network.neutron [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Updated VIF entry in instance network info cache for port 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.908 182096 DEBUG nova.network.neutron [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Updating instance_info_cache with network_info: [{"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:28:50 compute-0 nova_compute[182092]: 2026-01-23 09:28:50.929 182096 DEBUG oslo_concurrency.lockutils [req-b1674ab3-fa71-4434-95bc-f437f3c5d385 req-4e6f2326-6a49-41d6-b7fd-380c9b3c0d3d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1c7830f4-294f-413c-843b-113d480d4ed9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:28:51 compute-0 podman[222775]: 2026-01-23 09:28:51.124156615 +0000 UTC m=+0.030977244 container create d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:28:51 compute-0 systemd[1]: Started libpod-conmon-d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e.scope.
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.158 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160531.1585698, 1c7830f4-294f-413c-843b-113d480d4ed9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.159 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] VM Started (Lifecycle Event)
Jan 23 09:28:51 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:28:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/884fab4c2fd29f03303845424f1f5b5ffd2cdc8b0e6aedb007b749694803e535/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:28:51 compute-0 podman[222775]: 2026-01-23 09:28:51.171275678 +0000 UTC m=+0.078096317 container init d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:28:51 compute-0 podman[222775]: 2026-01-23 09:28:51.175510957 +0000 UTC m=+0.082331576 container start d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.175 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:51 compute-0 podman[222775]: 2026-01-23 09:28:51.107595146 +0000 UTC m=+0.014415785 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.178 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160531.1591215, 1c7830f4-294f-413c-843b-113d480d4ed9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.179 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] VM Paused (Lifecycle Event)
Jan 23 09:28:51 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [NOTICE]   (222797) : New worker (222799) forked
Jan 23 09:28:51 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [NOTICE]   (222797) : Loading success.
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.192 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.194 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:51 compute-0 nova_compute[182092]: 2026-01-23 09:28:51.211 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:52 compute-0 nova_compute[182092]: 2026-01-23 09:28:52.143 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:52 compute-0 nova_compute[182092]: 2026-01-23 09:28:52.159 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160517.1581829, a6e1cbfb-7cfe-45af-96d9-f198d0342d79 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:52 compute-0 nova_compute[182092]: 2026-01-23 09:28:52.159 182096 INFO nova.compute.manager [-] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] VM Stopped (Lifecycle Event)
Jan 23 09:28:52 compute-0 nova_compute[182092]: 2026-01-23 09:28:52.175 182096 DEBUG nova.compute.manager [None req-2542c7b5-d654-48bb-bdc2-c2acc3decfe4 - - - - - -] [instance: a6e1cbfb-7cfe-45af-96d9-f198d0342d79] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:55 compute-0 nova_compute[182092]: 2026-01-23 09:28:55.140 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.319 182096 DEBUG nova.compute.manager [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG nova.compute.manager [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Processing event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG nova.compute.manager [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.320 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.321 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.321 182096 DEBUG oslo_concurrency.lockutils [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.321 182096 DEBUG nova.compute.manager [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] No waiting events found dispatching network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.321 182096 WARNING nova.compute.manager [req-ca6d253b-eb1b-40b2-9911-e653ab6f7b37 req-fa4bc470-1887-4b97-82da-417b2c4101e6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received unexpected event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 for instance with vm_state building and task_state spawning.
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.321 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.323 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160536.323836, 1c7830f4-294f-413c-843b-113d480d4ed9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.324 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] VM Resumed (Lifecycle Event)
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.326 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.328 182096 INFO nova.virt.libvirt.driver [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Instance spawned successfully.
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.328 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.350 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.354 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.357 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.358 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.358 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.358 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.359 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.359 182096 DEBUG nova.virt.libvirt.driver [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.380 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.411 182096 INFO nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Took 9.38 seconds to spawn the instance on the hypervisor.
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.411 182096 DEBUG nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.460 182096 INFO nova.compute.manager [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Took 9.76 seconds to build instance.
Jan 23 09:28:56 compute-0 nova_compute[182092]: 2026-01-23 09:28:56.474 182096 DEBUG oslo_concurrency.lockutils [None req-90facff8-3dbb-47e4-b2b0-36bdfdee48c5 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:57 compute-0 nova_compute[182092]: 2026-01-23 09:28:57.145 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:58 compute-0 podman[222804]: 2026-01-23 09:28:58.220889143 +0000 UTC m=+0.057901443 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.390 182096 DEBUG nova.objects.instance [None req-ff4b6f57-ff5b-46a0-9024-a96ecb8e0de8 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c7830f4-294f-413c-843b-113d480d4ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.402 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160538.4027343, 1c7830f4-294f-413c-843b-113d480d4ed9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.403 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] VM Paused (Lifecycle Event)
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.415 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.416 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.439 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 23 09:28:58 compute-0 kernel: tap4d2cf5ac-5e (unregistering): left promiscuous mode
Jan 23 09:28:58 compute-0 NetworkManager[54920]: <info>  [1769160538.7293] device (tap4d2cf5ac-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:28:58 compute-0 ovn_controller[94697]: 2026-01-23T09:28:58Z|00401|binding|INFO|Releasing lport 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 from this chassis (sb_readonly=0)
Jan 23 09:28:58 compute-0 ovn_controller[94697]: 2026-01-23T09:28:58Z|00402|binding|INFO|Setting lport 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 down in Southbound
Jan 23 09:28:58 compute-0 ovn_controller[94697]: 2026-01-23T09:28:58Z|00403|binding|INFO|Removing iface tap4d2cf5ac-5e ovn-installed in OVS
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.735 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.737 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:bd:1f 10.100.0.13'], port_security=['fa:16:3e:d5:bd:1f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1c7830f4-294f-413c-843b-113d480d4ed9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '439a96c54de44b239d374954108968be', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd676b957-c18c-4bbf-b899-13521827f709', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=38af584d-6b9d-4c98-af0a-2dc34a75edac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.738 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 in datapath 0ebeaff5-3e26-47fb-9cca-bb690223e5fb unbound from our chassis
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.740 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ebeaff5-3e26-47fb-9cca-bb690223e5fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.740 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca1edad-e95d-4000-9017-f81a1571454e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.741 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb namespace which is not needed anymore
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.752 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:58 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 23 09:28:58 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000068.scope: Consumed 2.481s CPU time.
Jan 23 09:28:58 compute-0 systemd-machined[153562]: Machine qemu-53-instance-00000068 terminated.
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [NOTICE]   (222797) : haproxy version is 2.8.14-c23fe91
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [NOTICE]   (222797) : path to executable is /usr/sbin/haproxy
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [WARNING]  (222797) : Exiting Master process...
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [WARNING]  (222797) : Exiting Master process...
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [ALERT]    (222797) : Current worker (222799) exited with code 143 (Terminated)
Jan 23 09:28:58 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[222793]: [WARNING]  (222797) : All workers exited. Exiting... (0)
Jan 23 09:28:58 compute-0 systemd[1]: libpod-d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e.scope: Deactivated successfully.
Jan 23 09:28:58 compute-0 podman[222851]: 2026-01-23 09:28:58.839831894 +0000 UTC m=+0.035938985 container died d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.850 182096 DEBUG nova.compute.manager [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-vif-unplugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.850 182096 DEBUG oslo_concurrency.lockutils [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.850 182096 DEBUG oslo_concurrency.lockutils [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.850 182096 DEBUG oslo_concurrency.lockutils [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.851 182096 DEBUG nova.compute.manager [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] No waiting events found dispatching network-vif-unplugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.851 182096 WARNING nova.compute.manager [req-0bb1afb7-e046-4d5e-8968-115ccb427e8a req-91937be9-d6f1-46a2-bdfb-a65de773376e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received unexpected event network-vif-unplugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 for instance with vm_state active and task_state suspending.
Jan 23 09:28:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e-userdata-shm.mount: Deactivated successfully.
Jan 23 09:28:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-884fab4c2fd29f03303845424f1f5b5ffd2cdc8b0e6aedb007b749694803e535-merged.mount: Deactivated successfully.
Jan 23 09:28:58 compute-0 podman[222851]: 2026-01-23 09:28:58.865721712 +0000 UTC m=+0.061828801 container cleanup d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:28:58 compute-0 systemd[1]: libpod-conmon-d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e.scope: Deactivated successfully.
Jan 23 09:28:58 compute-0 podman[222874]: 2026-01-23 09:28:58.906107783 +0000 UTC m=+0.025010651 container remove d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.910 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e22df3fa-2799-4a5e-a028-d9e293664338]: (4, ('Fri Jan 23 09:28:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb (d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e)\nd58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e\nFri Jan 23 09:28:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb (d58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e)\nd58df9d36d8bfb8f65e56fac6f6ab441096feb98d2cea6b5a75f624f599ab43e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.911 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[12041500-f35d-48e1-a36e-1b1951cf961e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.911 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ebeaff5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:28:58 compute-0 kernel: tap0ebeaff5-30: left promiscuous mode
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.913 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:58 compute-0 NetworkManager[54920]: <info>  [1769160538.9291] manager: (tap4d2cf5ac-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.929 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.933 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d993bd-0f3e-4aa1-99b8-80f2ff390ac1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.946 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad69ae5-8a9c-478e-bcae-4b8a0d3e350a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.946 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d71d5a98-341c-47f8-b6a9-e922f7e395bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.959 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[58cb76a0-cfc8-45aa-870d-4e5515fac338]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403616, 'reachable_time': 17698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222898, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d0ebeaff5\x2d3e26\x2d47fb\x2d9cca\x2dbb690223e5fb.mount: Deactivated successfully.
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.961 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:28:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:28:58.961 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[27c787e0-d34e-43af-bbf0-bb9e2e734e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:28:58 compute-0 nova_compute[182092]: 2026-01-23 09:28:58.963 182096 DEBUG nova.compute.manager [None req-ff4b6f57-ff5b-46a0-9024-a96ecb8e0de8 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:00 compute-0 nova_compute[182092]: 2026-01-23 09:29:00.143 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.607 182096 DEBUG nova.compute.manager [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.607 182096 DEBUG oslo_concurrency.lockutils [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.607 182096 DEBUG oslo_concurrency.lockutils [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.607 182096 DEBUG oslo_concurrency.lockutils [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.608 182096 DEBUG nova.compute.manager [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] No waiting events found dispatching network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.608 182096 WARNING nova.compute.manager [req-337fcaa1-7c66-4694-981f-1fdc537b5bae req-9483983f-d67a-48d0-872e-ce2c3ff952bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received unexpected event network-vif-plugged-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 for instance with vm_state suspended and task_state None.
Jan 23 09:29:01 compute-0 nova_compute[182092]: 2026-01-23 09:29:01.669 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:02 compute-0 nova_compute[182092]: 2026-01-23 09:29:02.147 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:02 compute-0 nova_compute[182092]: 2026-01-23 09:29:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.548 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.548 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.548 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.548 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.549 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.554 182096 INFO nova.compute.manager [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Terminating instance
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.560 182096 DEBUG nova.compute.manager [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.563 182096 INFO nova.virt.libvirt.driver [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Instance destroyed successfully.
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.564 182096 DEBUG nova.objects.instance [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'resources' on Instance uuid 1c7830f4-294f-413c-843b-113d480d4ed9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.571 182096 DEBUG nova.virt.libvirt.vif [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:28:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1045320169',display_name='tempest-DeleteServersTestJSON-server-1045320169',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1045320169',id=104,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:28:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-tmx90p5w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:28:58Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=1c7830f4-294f-413c-843b-113d480d4ed9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.572 182096 DEBUG nova.network.os_vif_util [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "address": "fa:16:3e:d5:bd:1f", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d2cf5ac-5e", "ovs_interfaceid": "4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.572 182096 DEBUG nova.network.os_vif_util [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.573 182096 DEBUG os_vif [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.574 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.574 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d2cf5ac-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.576 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.578 182096 INFO os_vif [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:bd:1f,bridge_name='br-int',has_traffic_filtering=True,id=4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d2cf5ac-5e')
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.578 182096 INFO nova.virt.libvirt.driver [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Deleting instance files /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9_del
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.579 182096 INFO nova.virt.libvirt.driver [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Deletion of /var/lib/nova/instances/1c7830f4-294f-413c-843b-113d480d4ed9_del complete
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.637 182096 INFO nova.compute.manager [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Took 0.08 seconds to destroy the instance on the hypervisor.
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.637 182096 DEBUG oslo.service.loopingcall [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.638 182096 DEBUG nova.compute.manager [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:29:03 compute-0 nova_compute[182092]: 2026-01-23 09:29:03.638 182096 DEBUG nova.network.neutron [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.159 182096 DEBUG nova.network.neutron [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.169 182096 INFO nova.compute.manager [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Took 0.53 seconds to deallocate network for instance.
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.208 182096 DEBUG nova.compute.manager [req-9d598743-7da0-45af-b554-a5f140990b1c req-3d8d4857-8313-4fe4-8f1a-09f089b9b26d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Received event network-vif-deleted-4d2cf5ac-5e59-45ff-a8a3-c65181e0ae99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.228 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.228 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.273 182096 DEBUG nova.compute.provider_tree [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.286 182096 DEBUG nova.scheduler.client.report [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.300 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.318 182096 INFO nova.scheduler.client.report [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Deleted allocations for instance 1c7830f4-294f-413c-843b-113d480d4ed9
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.367 182096 DEBUG oslo_concurrency.lockutils [None req-66e04f35-ef06-4e95-8b50-6d1a9f85b508 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "1c7830f4-294f-413c-843b-113d480d4ed9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.819s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:04 compute-0 nova_compute[182092]: 2026-01-23 09:29:04.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:05 compute-0 podman[222904]: 2026-01-23 09:29:05.200400192 +0000 UTC m=+0.038316490 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:29:05 compute-0 podman[222905]: 2026-01-23 09:29:05.229324712 +0000 UTC m=+0.066220085 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.860 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.860 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.861 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:05 compute-0 nova_compute[182092]: 2026-01-23 09:29:05.861 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.059 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.059 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5617MB free_disk=73.2636489868164GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.059 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.060 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.131 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.139 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.151 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:29:06 compute-0 nova_compute[182092]: 2026-01-23 09:29:06.152 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.148 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:07.335 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:29:07 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:07.335 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.335 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.528 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.529 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.543 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.605 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.606 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.611 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.611 182096 INFO nova.compute.claims [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.689 182096 DEBUG nova.compute.provider_tree [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.699 182096 DEBUG nova.scheduler.client.report [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.712 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.712 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.747 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.747 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.759 182096 INFO nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.770 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.841 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.841 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.842 182096 INFO nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Creating image(s)
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.842 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.842 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.843 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.853 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.899 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.900 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.901 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.910 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.954 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.955 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.975 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.976 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:07 compute-0 nova_compute[182092]: 2026-01-23 09:29:07.976 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.021 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.022 182096 DEBUG nova.virt.disk.api [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Checking if we can resize image /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.022 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.067 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.068 182096 DEBUG nova.virt.disk.api [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Cannot resize image /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.068 182096 DEBUG nova.objects.instance [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'migration_context' on Instance uuid bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.080 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.080 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Ensure instance console log exists: /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.080 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.081 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.081 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.148 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.148 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.148 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.148 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.164 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.164 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.237 182096 DEBUG nova.policy [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff07a58fbdbb442cb7faa2ceb6066b86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '439a96c54de44b239d374954108968be', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.576 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:29:08 compute-0 nova_compute[182092]: 2026-01-23 09:29:08.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:29:09 compute-0 nova_compute[182092]: 2026-01-23 09:29:09.239 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Successfully created port: b72ab260-6d0e-4570-8b7c-a31a315016ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:29:09 compute-0 nova_compute[182092]: 2026-01-23 09:29:09.901 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Successfully updated port: b72ab260-6d0e-4570-8b7c-a31a315016ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:29:09 compute-0 nova_compute[182092]: 2026-01-23 09:29:09.912 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:09 compute-0 nova_compute[182092]: 2026-01-23 09:29:09.913 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquired lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:09 compute-0 nova_compute[182092]: 2026-01-23 09:29:09.913 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:29:10 compute-0 nova_compute[182092]: 2026-01-23 09:29:10.034 182096 DEBUG nova.compute.manager [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-changed-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:10 compute-0 nova_compute[182092]: 2026-01-23 09:29:10.034 182096 DEBUG nova.compute.manager [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Refreshing instance network info cache due to event network-changed-b72ab260-6d0e-4570-8b7c-a31a315016ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:29:10 compute-0 nova_compute[182092]: 2026-01-23 09:29:10.034 182096 DEBUG oslo_concurrency.lockutils [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:10 compute-0 nova_compute[182092]: 2026-01-23 09:29:10.082 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.259 182096 DEBUG nova.network.neutron [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updating instance_info_cache with network_info: [{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.274 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Releasing lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.274 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance network_info: |[{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.274 182096 DEBUG oslo_concurrency.lockutils [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.275 182096 DEBUG nova.network.neutron [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Refreshing network info cache for port b72ab260-6d0e-4570-8b7c-a31a315016ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.277 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Start _get_guest_xml network_info=[{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.280 182096 WARNING nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.287 182096 DEBUG nova.virt.libvirt.host [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.287 182096 DEBUG nova.virt.libvirt.host [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.291 182096 DEBUG nova.virt.libvirt.host [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.292 182096 DEBUG nova.virt.libvirt.host [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.293 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.293 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.293 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.294 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.294 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.294 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.295 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.295 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.295 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.296 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.296 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.296 182096 DEBUG nova.virt.hardware [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.299 182096 DEBUG nova.virt.libvirt.vif [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1686633299',display_name='tempest-DeleteServersTestJSON-server-1686633299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1686633299',id=107,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-8y0xzgj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON
-2005458721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:07Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.299 182096 DEBUG nova.network.os_vif_util [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.300 182096 DEBUG nova.network.os_vif_util [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.301 182096 DEBUG nova.objects.instance [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'pci_devices' on Instance uuid bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.310 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <uuid>bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a</uuid>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <name>instance-0000006b</name>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:name>tempest-DeleteServersTestJSON-server-1686633299</nova:name>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:29:11</nova:creationTime>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:user uuid="ff07a58fbdbb442cb7faa2ceb6066b86">tempest-DeleteServersTestJSON-2005458721-project-member</nova:user>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:project uuid="439a96c54de44b239d374954108968be">tempest-DeleteServersTestJSON-2005458721</nova:project>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         <nova:port uuid="b72ab260-6d0e-4570-8b7c-a31a315016ee">
Jan 23 09:29:11 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <system>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="serial">bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="uuid">bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </system>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <os>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </os>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <features>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </features>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:c0:98:15"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <target dev="tapb72ab260-6d"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/console.log" append="off"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <video>
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </video>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:29:11 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:29:11 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:29:11 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:29:11 compute-0 nova_compute[182092]: </domain>
Jan 23 09:29:11 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.311 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Preparing to wait for external event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.312 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.312 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.312 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.313 182096 DEBUG nova.virt.libvirt.vif [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1686633299',display_name='tempest-DeleteServersTestJSON-server-1686633299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1686633299',id=107,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-8y0xzgj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:07Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.313 182096 DEBUG nova.network.os_vif_util [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.314 182096 DEBUG nova.network.os_vif_util [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.314 182096 DEBUG os_vif [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.314 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.315 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.315 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.317 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.317 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb72ab260-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.317 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb72ab260-6d, col_values=(('external_ids', {'iface-id': 'b72ab260-6d0e-4570-8b7c-a31a315016ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:98:15', 'vm-uuid': 'bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.3194] manager: (tapb72ab260-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.318 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.321 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.323 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.324 182096 INFO os_vif [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d')
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.355 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.356 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.356 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] No VIF found with MAC fa:16:3e:c0:98:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.356 182096 INFO nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Using config drive
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.586 182096 INFO nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Creating config drive at /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.598 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_yphb4gn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.716 182096 DEBUG oslo_concurrency.processutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_yphb4gn" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:11 compute-0 kernel: tapb72ab260-6d: entered promiscuous mode
Jan 23 09:29:11 compute-0 ovn_controller[94697]: 2026-01-23T09:29:11Z|00404|binding|INFO|Claiming lport b72ab260-6d0e-4570-8b7c-a31a315016ee for this chassis.
Jan 23 09:29:11 compute-0 ovn_controller[94697]: 2026-01-23T09:29:11Z|00405|binding|INFO|b72ab260-6d0e-4570-8b7c-a31a315016ee: Claiming fa:16:3e:c0:98:15 10.100.0.10
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.7592] manager: (tapb72ab260-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.757 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.763 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:98:15 10.100.0.10'], port_security=['fa:16:3e:c0:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '439a96c54de44b239d374954108968be', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd676b957-c18c-4bbf-b899-13521827f709', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=38af584d-6b9d-4c98-af0a-2dc34a75edac, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b72ab260-6d0e-4570-8b7c-a31a315016ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.764 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b72ab260-6d0e-4570-8b7c-a31a315016ee in datapath 0ebeaff5-3e26-47fb-9cca-bb690223e5fb bound to our chassis
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.765 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.772 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 ovn_controller[94697]: 2026-01-23T09:29:11Z|00406|binding|INFO|Setting lport b72ab260-6d0e-4570-8b7c-a31a315016ee ovn-installed in OVS
Jan 23 09:29:11 compute-0 ovn_controller[94697]: 2026-01-23T09:29:11Z|00407|binding|INFO|Setting lport b72ab260-6d0e-4570-8b7c-a31a315016ee up in Southbound
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.773 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[48f6dd9e-1feb-4332-8516-0cd8b89bafc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.773 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0ebeaff5-31 in ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.774 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.774 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0ebeaff5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.774 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4843db-f3dc-49d6-b22f-f34c6add155f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.775 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cabd9bd3-8eaa-49e4-9bc3-f59087970cbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.776 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.784 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[2bac41d6-d619-4fce-9aee-1e96d4d15b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 systemd-udevd[222978]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.793 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb946a68-f9be-4c00-a7ef-a4a2d9610814]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.7973] device (tapb72ab260-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:29:11 compute-0 systemd-machined[153562]: New machine qemu-54-instance-0000006b.
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.7980] device (tapb72ab260-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:29:11 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-0000006b.
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.812 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6eba28f0-75f5-4691-807e-0b4fc7fc076d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.815 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[627330bb-66e4-4574-ac75-a8de3492f8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.8158] manager: (tap0ebeaff5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 23 09:29:11 compute-0 systemd-udevd[222983]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.834 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a046a176-d34f-4288-97a0-2faab9b99e1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.836 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[82fdaafe-4ec1-46c6-b29e-af8590b06af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.8514] device (tap0ebeaff5-30): carrier: link connected
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.855 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[27a16d13-cc55-4cde-b701-a837d34b6ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe30ad00-81fd-4e2e-9e92-7e2616e7b9fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ebeaff5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:ee:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405733, 'reachable_time': 15097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223002, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.875 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[de5613ef-c861-442a-ab61-6661758639f0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:eeba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 405733, 'tstamp': 405733}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223003, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.885 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[89a6ffae-bac3-4625-98ef-fb5c5c240e10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0ebeaff5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:ee:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405733, 'reachable_time': 15097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223004, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.902 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4db082fa-7c62-4a56-a743-937ce87291cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.933 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a51bb560-f37e-49a3-aa99-70fb82ad23bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.934 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ebeaff5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.934 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.935 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ebeaff5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 NetworkManager[54920]: <info>  [1769160551.9368] manager: (tap0ebeaff5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 23 09:29:11 compute-0 kernel: tap0ebeaff5-30: entered promiscuous mode
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.940 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.945 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0ebeaff5-30, col_values=(('external_ids', {'iface-id': '08ec51c7-019e-438c-bd01-f7c758e0c6d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:11 compute-0 ovn_controller[94697]: 2026-01-23T09:29:11Z|00408|binding|INFO|Releasing lport 08ec51c7-019e-438c-bd01-f7c758e0c6d5 from this chassis (sb_readonly=0)
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.955 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.956 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8d05b9-d672-468e-9e64-3f1ab79ab13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.956 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.pid.haproxy
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 0ebeaff5-3e26-47fb-9cca-bb690223e5fb
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:29:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:11.957 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'env', 'PROCESS_TAG=haproxy-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0ebeaff5-3e26-47fb-9cca-bb690223e5fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.957 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:11 compute-0 nova_compute[182092]: 2026-01-23 09:29:11.959 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.149 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.235 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160552.2348237, bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.235 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] VM Started (Lifecycle Event)
Jan 23 09:29:12 compute-0 podman[223038]: 2026-01-23 09:29:12.238356788 +0000 UTC m=+0.032913737 container create 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.249 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.252 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160552.2349124, bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.252 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] VM Paused (Lifecycle Event)
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.263 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:12 compute-0 systemd[1]: Started libpod-conmon-43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2.scope.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.266 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.280 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:29:12 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:29:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d87948191c2a28faa4e86e1ce7864a3c5b09d4265cbb4c4183645f27831a5e2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:29:12 compute-0 podman[223038]: 2026-01-23 09:29:12.297238067 +0000 UTC m=+0.091795017 container init 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:29:12 compute-0 podman[223038]: 2026-01-23 09:29:12.301975955 +0000 UTC m=+0.096532904 container start 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:29:12 compute-0 podman[223038]: 2026-01-23 09:29:12.222715292 +0000 UTC m=+0.017272252 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:29:12 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [NOTICE]   (223055) : New worker (223057) forked
Jan 23 09:29:12 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [NOTICE]   (223055) : Loading success.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.331 182096 DEBUG nova.compute.manager [req-4541ccec-b202-4de3-a42c-df06c8c78db8 req-3f8b6535-636a-4a3f-8a0c-a3298bf190ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.332 182096 DEBUG oslo_concurrency.lockutils [req-4541ccec-b202-4de3-a42c-df06c8c78db8 req-3f8b6535-636a-4a3f-8a0c-a3298bf190ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.332 182096 DEBUG oslo_concurrency.lockutils [req-4541ccec-b202-4de3-a42c-df06c8c78db8 req-3f8b6535-636a-4a3f-8a0c-a3298bf190ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.332 182096 DEBUG oslo_concurrency.lockutils [req-4541ccec-b202-4de3-a42c-df06c8c78db8 req-3f8b6535-636a-4a3f-8a0c-a3298bf190ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.333 182096 DEBUG nova.compute.manager [req-4541ccec-b202-4de3-a42c-df06c8c78db8 req-3f8b6535-636a-4a3f-8a0c-a3298bf190ce 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Processing event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.333 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.335 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160552.3354485, bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.335 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] VM Resumed (Lifecycle Event)
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.339 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.342 182096 INFO nova.virt.libvirt.driver [-] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance spawned successfully.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.342 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.348 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.350 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.367 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.367 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.368 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.368 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.369 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.369 182096 DEBUG nova.virt.libvirt.driver [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.371 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.441 182096 INFO nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Took 4.60 seconds to spawn the instance on the hypervisor.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.444 182096 DEBUG nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.516 182096 INFO nova.compute.manager [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Took 4.93 seconds to build instance.
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.538 182096 DEBUG oslo_concurrency.lockutils [None req-abfb3c86-75f2-4b2e-a7a5-5ddbf780d760 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.641 182096 DEBUG nova.network.neutron [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updated VIF entry in instance network info cache for port b72ab260-6d0e-4570-8b7c-a31a315016ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.641 182096 DEBUG nova.network.neutron [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updating instance_info_cache with network_info: [{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:12 compute-0 nova_compute[182092]: 2026-01-23 09:29:12.653 182096 DEBUG oslo_concurrency.lockutils [req-b64d054f-86ad-4226-a422-05684b9db95e req-012c297c-7bb0-4581-97fd-6be20c085b97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:13 compute-0 nova_compute[182092]: 2026-01-23 09:29:13.964 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160538.9636178, 1c7830f4-294f-413c-843b-113d480d4ed9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:13 compute-0 nova_compute[182092]: 2026-01-23 09:29:13.965 182096 INFO nova.compute.manager [-] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] VM Stopped (Lifecycle Event)
Jan 23 09:29:13 compute-0 nova_compute[182092]: 2026-01-23 09:29:13.989 182096 DEBUG nova.compute.manager [None req-bf366d1f-b3dc-4c49-9e31-b6964b6b2333 - - - - - -] [instance: 1c7830f4-294f-413c-843b-113d480d4ed9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.411 182096 DEBUG nova.compute.manager [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.412 182096 DEBUG oslo_concurrency.lockutils [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.412 182096 DEBUG oslo_concurrency.lockutils [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.412 182096 DEBUG oslo_concurrency.lockutils [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.412 182096 DEBUG nova.compute.manager [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] No waiting events found dispatching network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:14 compute-0 nova_compute[182092]: 2026-01-23 09:29:14.412 182096 WARNING nova.compute.manager [req-1a555007-c65e-4090-8b7d-7c5afc0a5cdb req-b68ab1df-53d2-4faa-8bc5-eb3723bf0ebe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received unexpected event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee for instance with vm_state active and task_state None.
Jan 23 09:29:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:15.338 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:16 compute-0 nova_compute[182092]: 2026-01-23 09:29:16.320 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:16 compute-0 nova_compute[182092]: 2026-01-23 09:29:16.692 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:16 compute-0 nova_compute[182092]: 2026-01-23 09:29:16.692 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquired lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:16 compute-0 nova_compute[182092]: 2026-01-23 09:29:16.692 182096 DEBUG nova.network.neutron [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:29:17 compute-0 nova_compute[182092]: 2026-01-23 09:29:17.151 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:17 compute-0 podman[223062]: 2026-01-23 09:29:17.21033953 +0000 UTC m=+0.043487152 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:29:17 compute-0 podman[223063]: 2026-01-23 09:29:17.242240555 +0000 UTC m=+0.075092661 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:29:18 compute-0 nova_compute[182092]: 2026-01-23 09:29:18.644 182096 DEBUG nova.network.neutron [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updating instance_info_cache with network_info: [{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:18 compute-0 nova_compute[182092]: 2026-01-23 09:29:18.658 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Releasing lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:18 compute-0 nova_compute[182092]: 2026-01-23 09:29:18.750 182096 DEBUG nova.virt.libvirt.driver [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 23 09:29:18 compute-0 nova_compute[182092]: 2026-01-23 09:29:18.750 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Creating file /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/69c47e4dca6240079aea7d33446469c0.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 23 09:29:18 compute-0 nova_compute[182092]: 2026-01-23 09:29:18.750 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/69c47e4dca6240079aea7d33446469c0.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.071 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/69c47e4dca6240079aea7d33446469c0.tmp" returned: 1 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.073 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/69c47e4dca6240079aea7d33446469c0.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.074 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Creating directory /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.074 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:19 compute-0 podman[223101]: 2026-01-23 09:29:19.20818498 +0000 UTC m=+0.042671322 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.240 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:19 compute-0 nova_compute[182092]: 2026-01-23 09:29:19.243 182096 DEBUG nova.virt.libvirt.driver [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:29:21 compute-0 nova_compute[182092]: 2026-01-23 09:29:21.322 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:22 compute-0 nova_compute[182092]: 2026-01-23 09:29:22.151 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:24 compute-0 ovn_controller[94697]: 2026-01-23T09:29:24Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:98:15 10.100.0.10
Jan 23 09:29:24 compute-0 ovn_controller[94697]: 2026-01-23T09:29:24Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:98:15 10.100.0.10
Jan 23 09:29:26 compute-0 nova_compute[182092]: 2026-01-23 09:29:26.325 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:27 compute-0 nova_compute[182092]: 2026-01-23 09:29:27.152 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:29 compute-0 podman[223131]: 2026-01-23 09:29:29.225917028 +0000 UTC m=+0.058360425 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 09:29:29 compute-0 nova_compute[182092]: 2026-01-23 09:29:29.278 182096 DEBUG nova.virt.libvirt.driver [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.328 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 kernel: tapb72ab260-6d (unregistering): left promiscuous mode
Jan 23 09:29:31 compute-0 NetworkManager[54920]: <info>  [1769160571.4069] device (tapb72ab260-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:29:31 compute-0 ovn_controller[94697]: 2026-01-23T09:29:31Z|00409|binding|INFO|Releasing lport b72ab260-6d0e-4570-8b7c-a31a315016ee from this chassis (sb_readonly=0)
Jan 23 09:29:31 compute-0 ovn_controller[94697]: 2026-01-23T09:29:31Z|00410|binding|INFO|Setting lport b72ab260-6d0e-4570-8b7c-a31a315016ee down in Southbound
Jan 23 09:29:31 compute-0 ovn_controller[94697]: 2026-01-23T09:29:31Z|00411|binding|INFO|Removing iface tapb72ab260-6d ovn-installed in OVS
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.411 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.414 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.423 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:98:15 10.100.0.10'], port_security=['fa:16:3e:c0:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '439a96c54de44b239d374954108968be', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd676b957-c18c-4bbf-b899-13521827f709', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=38af584d-6b9d-4c98-af0a-2dc34a75edac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b72ab260-6d0e-4570-8b7c-a31a315016ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.424 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b72ab260-6d0e-4570-8b7c-a31a315016ee in datapath 0ebeaff5-3e26-47fb-9cca-bb690223e5fb unbound from our chassis
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.426 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ebeaff5-3e26-47fb-9cca-bb690223e5fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.427 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.430 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc44641-3a54-4e97-b2d3-15e5f1c02e1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.431 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb namespace which is not needed anymore
Jan 23 09:29:31 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 23 09:29:31 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006b.scope: Consumed 11.401s CPU time.
Jan 23 09:29:31 compute-0 systemd-machined[153562]: Machine qemu-54-instance-0000006b terminated.
Jan 23 09:29:31 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [NOTICE]   (223055) : haproxy version is 2.8.14-c23fe91
Jan 23 09:29:31 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [NOTICE]   (223055) : path to executable is /usr/sbin/haproxy
Jan 23 09:29:31 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [WARNING]  (223055) : Exiting Master process...
Jan 23 09:29:31 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [ALERT]    (223055) : Current worker (223057) exited with code 143 (Terminated)
Jan 23 09:29:31 compute-0 neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb[223051]: [WARNING]  (223055) : All workers exited. Exiting... (0)
Jan 23 09:29:31 compute-0 systemd[1]: libpod-43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2.scope: Deactivated successfully.
Jan 23 09:29:31 compute-0 podman[223176]: 2026-01-23 09:29:31.524647321 +0000 UTC m=+0.032369512 container died 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:29:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2-userdata-shm.mount: Deactivated successfully.
Jan 23 09:29:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-d87948191c2a28faa4e86e1ce7864a3c5b09d4265cbb4c4183645f27831a5e2a-merged.mount: Deactivated successfully.
Jan 23 09:29:31 compute-0 podman[223176]: 2026-01-23 09:29:31.542217113 +0000 UTC m=+0.049939304 container cleanup 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:29:31 compute-0 systemd[1]: libpod-conmon-43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2.scope: Deactivated successfully.
Jan 23 09:29:31 compute-0 podman[223200]: 2026-01-23 09:29:31.583188586 +0000 UTC m=+0.023162060 container remove 43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.586 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[653426e9-da4e-4c45-a0b3-aa3f7ffbda8b]: (4, ('Fri Jan 23 09:29:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb (43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2)\n43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2\nFri Jan 23 09:29:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb (43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2)\n43fd7cd64d4d9ab5ec9c2d2e71a79622326d584d58a97524bbdb9a9e65bffcd2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.587 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[58f3997a-fc37-4c42-8ff8-7859ddc15d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.588 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ebeaff5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.590 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 kernel: tap0ebeaff5-30: left promiscuous mode
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.606 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.609 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5771d3cd-aed0-4ae6-b2bc-21c42ff23495]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.617 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[211afc14-2ae2-4ae1-8d9d-214aba8f7601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.619 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[24fe0863-f240-4dc5-81fe-8b873962ec92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.625 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 nova_compute[182092]: 2026-01-23 09:29:31.628 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d0ebeaff5\x2d3e26\x2d47fb\x2d9cca\x2dbb690223e5fb.mount: Deactivated successfully.
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.630 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8de59b5c-f898-4aff-9593-81db697ec627]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405729, 'reachable_time': 24264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223218, 'error': None, 'target': 'ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.633 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0ebeaff5-3e26-47fb-9cca-bb690223e5fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:29:31 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:31.634 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b10bdb14-a4f7-445a-975e-4b44a327af6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.153 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.288 182096 INFO nova.virt.libvirt.driver [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance shutdown successfully after 13 seconds.
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.292 182096 INFO nova.virt.libvirt.driver [-] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Instance destroyed successfully.
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.292 182096 DEBUG nova.virt.libvirt.vif [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1686633299',display_name='tempest-DeleteServersTestJSON-server-1686633299',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1686633299',id=107,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:29:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-8y0xzgj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video
_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:29:16Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-2107966006-network", "vif_mac": "fa:16:3e:c0:98:15"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.293 182096 DEBUG nova.network.os_vif_util [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-2107966006-network", "vif_mac": "fa:16:3e:c0:98:15"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.293 182096 DEBUG nova.network.os_vif_util [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.294 182096 DEBUG os_vif [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.295 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.296 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb72ab260-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.297 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.298 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.300 182096 INFO os_vif [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d')
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.302 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.344 182096 DEBUG nova.compute.manager [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-unplugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.345 182096 DEBUG oslo_concurrency.lockutils [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.345 182096 DEBUG oslo_concurrency.lockutils [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.345 182096 DEBUG oslo_concurrency.lockutils [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.346 182096 DEBUG nova.compute.manager [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] No waiting events found dispatching network-vif-unplugged-b72ab260-6d0e-4570-8b7c-a31a315016ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.346 182096 WARNING nova.compute.manager [req-26971578-4544-44d3-ada7-c4cb13d9f8aa req-7171a516-1630-4c65-b5ec-2bb63430667d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received unexpected event network-vif-unplugged-b72ab260-6d0e-4570-8b7c-a31a315016ee for instance with vm_state active and task_state resize_migrating.
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.350 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.350 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.395 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.396 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Copying file /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk to 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:29:32 compute-0 nova_compute[182092]: 2026-01-23 09:29:32.397 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.010 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "scp -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk" returned: 0 in 0.613s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.011 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Copying file /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.011 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.config 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.194 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "scp -C -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.config 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.config" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.195 182096 DEBUG nova.virt.libvirt.volume.remotefs [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Copying file /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.195 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.info 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.373 182096 DEBUG oslo_concurrency.processutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] CMD "scp -C -r /var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a_resize/disk.info 192.168.122.101:/var/lib/nova/instances/bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a/disk.info" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:33 compute-0 nova_compute[182092]: 2026-01-23 09:29:33.978 182096 DEBUG neutronclient.v2_0.client [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b72ab260-6d0e-4570-8b7c-a31a315016ee for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.057 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.058 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.058 182096 DEBUG oslo_concurrency.lockutils [None req-6ad68f75-3cf3-4d39-860e-e140c9024be6 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.614 182096 DEBUG nova.compute.manager [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.615 182096 DEBUG oslo_concurrency.lockutils [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.615 182096 DEBUG oslo_concurrency.lockutils [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.615 182096 DEBUG oslo_concurrency.lockutils [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.615 182096 DEBUG nova.compute.manager [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] No waiting events found dispatching network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.615 182096 WARNING nova.compute.manager [req-67c8dd0c-fff6-4281-942f-295fe94d75fb req-4fee9528-9e0b-4f0e-ae84-314f10f04f52 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received unexpected event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee for instance with vm_state active and task_state resize_migrated.
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.934 182096 DEBUG nova.compute.manager [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-changed-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.934 182096 DEBUG nova.compute.manager [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Refreshing instance network info cache due to event network-changed-b72ab260-6d0e-4570-8b7c-a31a315016ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.935 182096 DEBUG oslo_concurrency.lockutils [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.935 182096 DEBUG oslo_concurrency.lockutils [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:34 compute-0 nova_compute[182092]: 2026-01-23 09:29:34.935 182096 DEBUG nova.network.neutron [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Refreshing network info cache for port b72ab260-6d0e-4570-8b7c-a31a315016ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:29:36 compute-0 podman[223245]: 2026-01-23 09:29:36.228756003 +0000 UTC m=+0.066523051 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:29:36 compute-0 podman[223246]: 2026-01-23 09:29:36.234170294 +0000 UTC m=+0.071246051 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.588 182096 DEBUG nova.network.neutron [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updated VIF entry in instance network info cache for port b72ab260-6d0e-4570-8b7c-a31a315016ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.588 182096 DEBUG nova.network.neutron [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updating instance_info_cache with network_info: [{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.603 182096 DEBUG oslo_concurrency.lockutils [req-8690dc58-7f5b-4cab-a68b-fd0194d70934 req-e4942ffd-59f3-49a9-bbd3-5b621db8096e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.717 182096 DEBUG nova.compute.manager [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.717 182096 DEBUG oslo_concurrency.lockutils [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.718 182096 DEBUG oslo_concurrency.lockutils [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.718 182096 DEBUG oslo_concurrency.lockutils [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.718 182096 DEBUG nova.compute.manager [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] No waiting events found dispatching network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:36 compute-0 nova_compute[182092]: 2026-01-23 09:29:36.718 182096 WARNING nova.compute.manager [req-8db7c097-b0f2-4126-a060-b6605f9d4913 req-b2879017-1440-4093-a9cd-65322073f206 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received unexpected event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee for instance with vm_state active and task_state resize_finish.
Jan 23 09:29:37 compute-0 nova_compute[182092]: 2026-01-23 09:29:37.155 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:37 compute-0 nova_compute[182092]: 2026-01-23 09:29:37.297 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.390 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.391 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.391 182096 DEBUG nova.compute.manager [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Going to confirm migration 18 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.415 182096 DEBUG nova.objects.instance [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'info_cache' on Instance uuid bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.805 182096 DEBUG nova.compute.manager [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.806 182096 DEBUG oslo_concurrency.lockutils [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.806 182096 DEBUG oslo_concurrency.lockutils [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.806 182096 DEBUG oslo_concurrency.lockutils [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.806 182096 DEBUG nova.compute.manager [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] No waiting events found dispatching network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.807 182096 WARNING nova.compute.manager [req-6be662ec-985e-459e-91b0-ce709926f307 req-f7b716bc-7bd7-4349-b45a-b7e1703d2f92 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Received unexpected event network-vif-plugged-b72ab260-6d0e-4570-8b7c-a31a315016ee for instance with vm_state resized and task_state deleting.
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.818 182096 DEBUG neutronclient.v2_0.client [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b72ab260-6d0e-4570-8b7c-a31a315016ee for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.818 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.819 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquired lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:38 compute-0 nova_compute[182092]: 2026-01-23 09:29:38.819 182096 DEBUG nova.network.neutron [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:29:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:39.862 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:39.863 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:39.863 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.564 182096 DEBUG nova.network.neutron [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Updating instance_info_cache with network_info: [{"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.585 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Releasing lock "refresh_cache-bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.585 182096 DEBUG nova.objects.instance [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lazy-loading 'migration_context' on Instance uuid bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.603 182096 DEBUG nova.virt.libvirt.vif [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1686633299',display_name='tempest-DeleteServersTestJSON-server-1686633299',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1686633299',id=107,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:29:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='439a96c54de44b239d374954108968be',ramdisk_id='',reservation_id='r-8y0xzgj3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2005458721',owner_user_name='tempest-DeleteServersTestJSON-2005458721-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:29:38Z,user_data=None,user_id='ff07a58fbdbb442cb7faa2ceb6066b86',uuid=bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.603 182096 DEBUG nova.network.os_vif_util [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converting VIF {"id": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "address": "fa:16:3e:c0:98:15", "network": {"id": "0ebeaff5-3e26-47fb-9cca-bb690223e5fb", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2107966006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "439a96c54de44b239d374954108968be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb72ab260-6d", "ovs_interfaceid": "b72ab260-6d0e-4570-8b7c-a31a315016ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.604 182096 DEBUG nova.network.os_vif_util [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.604 182096 DEBUG os_vif [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.606 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.606 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb72ab260-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.606 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.608 182096 INFO os_vif [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:98:15,bridge_name='br-int',has_traffic_filtering=True,id=b72ab260-6d0e-4570-8b7c-a31a315016ee,network=Network(0ebeaff5-3e26-47fb-9cca-bb690223e5fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb72ab260-6d')
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.608 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.609 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.668 182096 DEBUG nova.compute.provider_tree [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.683 182096 DEBUG nova.scheduler.client.report [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.722 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.835 182096 INFO nova.scheduler.client.report [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Deleted allocation for migration 9d08de35-2566-44bb-8e8c-ad7e7addf01d
Jan 23 09:29:40 compute-0 nova_compute[182092]: 2026-01-23 09:29:40.890 182096 DEBUG oslo_concurrency.lockutils [None req-ca8c6a1c-fa86-4021-9509-34adb5f6c759 ff07a58fbdbb442cb7faa2ceb6066b86 439a96c54de44b239d374954108968be - - default default] Lock "bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:42 compute-0 nova_compute[182092]: 2026-01-23 09:29:42.155 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:42 compute-0 nova_compute[182092]: 2026-01-23 09:29:42.298 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:45 compute-0 nova_compute[182092]: 2026-01-23 09:29:45.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:46 compute-0 nova_compute[182092]: 2026-01-23 09:29:46.654 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160571.6539626, bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:46 compute-0 nova_compute[182092]: 2026-01-23 09:29:46.654 182096 INFO nova.compute.manager [-] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] VM Stopped (Lifecycle Event)
Jan 23 09:29:46 compute-0 nova_compute[182092]: 2026-01-23 09:29:46.669 182096 DEBUG nova.compute.manager [None req-12293c5e-b65e-4572-8fd2-1db89a5e413e - - - - - -] [instance: bb6d5c13-2d4d-4ae2-bf42-bcd89924d05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:47 compute-0 nova_compute[182092]: 2026-01-23 09:29:47.156 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:47 compute-0 nova_compute[182092]: 2026-01-23 09:29:47.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:48 compute-0 podman[223285]: 2026-01-23 09:29:48.204205693 +0000 UTC m=+0.040348340 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:29:48 compute-0 podman[223286]: 2026-01-23 09:29:48.205468851 +0000 UTC m=+0.039634357 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:29:50 compute-0 podman[223322]: 2026-01-23 09:29:50.198151446 +0000 UTC m=+0.036573356 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal)
Jan 23 09:29:52 compute-0 nova_compute[182092]: 2026-01-23 09:29:52.157 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:52 compute-0 nova_compute[182092]: 2026-01-23 09:29:52.301 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.459 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.460 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.473 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.567 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.568 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.572 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.573 182096 INFO nova.compute.claims [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.682 182096 DEBUG nova.compute.provider_tree [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.695 182096 DEBUG nova.scheduler.client.report [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.714 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.714 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.751 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.752 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.763 182096 INFO nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.773 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.870 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.871 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.871 182096 INFO nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Creating image(s)
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.872 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.872 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.873 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.883 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.927 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.928 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.929 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.938 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.963 182096 DEBUG nova.policy [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fcfaa4d06934f93b9116c73f6f5e006', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d9bdf487901467e93e87ea077f02990', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.982 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:53 compute-0 nova_compute[182092]: 2026-01-23 09:29:53.982 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.003 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.004 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.004 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.048 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.049 182096 DEBUG nova.virt.disk.api [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Checking if we can resize image /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.049 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.092 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.093 182096 DEBUG nova.virt.disk.api [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Cannot resize image /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.094 182096 DEBUG nova.objects.instance [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lazy-loading 'migration_context' on Instance uuid de422406-f05b-4543-9135-d46371948cc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.111 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.111 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Ensure instance console log exists: /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.111 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.112 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.112 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:54 compute-0 nova_compute[182092]: 2026-01-23 09:29:54.696 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Successfully created port: 4318352b-3d26-4823-bf79-e962072b9e7d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.486 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Successfully updated port: 4318352b-3d26-4823-bf79-e962072b9e7d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.506 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.506 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquired lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.506 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.583 182096 DEBUG nova.compute.manager [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-changed-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.584 182096 DEBUG nova.compute.manager [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Refreshing instance network info cache due to event network-changed-4318352b-3d26-4823-bf79-e962072b9e7d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.584 182096 DEBUG oslo_concurrency.lockutils [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:29:55 compute-0 nova_compute[182092]: 2026-01-23 09:29:55.680 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.782 182096 DEBUG nova.network.neutron [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Updating instance_info_cache with network_info: [{"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.802 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Releasing lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.802 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Instance network_info: |[{"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.803 182096 DEBUG oslo_concurrency.lockutils [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.803 182096 DEBUG nova.network.neutron [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Refreshing network info cache for port 4318352b-3d26-4823-bf79-e962072b9e7d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.805 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Start _get_guest_xml network_info=[{"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.808 182096 WARNING nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.812 182096 DEBUG nova.virt.libvirt.host [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.812 182096 DEBUG nova.virt.libvirt.host [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.814 182096 DEBUG nova.virt.libvirt.host [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.815 182096 DEBUG nova.virt.libvirt.host [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.816 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.816 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.816 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.816 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.817 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.817 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.817 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.817 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.817 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.818 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.818 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.818 182096 DEBUG nova.virt.hardware [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.821 182096 DEBUG nova.virt.libvirt.vif [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-642417068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-642417068',id=110,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d9bdf487901467e93e87ea077f02990',ramdisk_id='',reservation_id='r-01ki4000',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-709013695',owner_user_name='tempest-ServerTagsTestJSON-709013695-project-member'},tags=TagList,task_state='spawnin
g',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:53Z,user_data=None,user_id='3fcfaa4d06934f93b9116c73f6f5e006',uuid=de422406-f05b-4543-9135-d46371948cc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.821 182096 DEBUG nova.network.os_vif_util [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converting VIF {"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.821 182096 DEBUG nova.network.os_vif_util [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.823 182096 DEBUG nova.objects.instance [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lazy-loading 'pci_devices' on Instance uuid de422406-f05b-4543-9135-d46371948cc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.833 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <uuid>de422406-f05b-4543-9135-d46371948cc8</uuid>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <name>instance-0000006e</name>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerTagsTestJSON-server-642417068</nova:name>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:29:56</nova:creationTime>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:user uuid="3fcfaa4d06934f93b9116c73f6f5e006">tempest-ServerTagsTestJSON-709013695-project-member</nova:user>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:project uuid="5d9bdf487901467e93e87ea077f02990">tempest-ServerTagsTestJSON-709013695</nova:project>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         <nova:port uuid="4318352b-3d26-4823-bf79-e962072b9e7d">
Jan 23 09:29:56 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <system>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="serial">de422406-f05b-4543-9135-d46371948cc8</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="uuid">de422406-f05b-4543-9135-d46371948cc8</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </system>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <os>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </os>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <features>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </features>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.config"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:52:75:a2"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <target dev="tap4318352b-3d"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/console.log" append="off"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <video>
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </video>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:29:56 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:29:56 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:29:56 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:29:56 compute-0 nova_compute[182092]: </domain>
Jan 23 09:29:56 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.834 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Preparing to wait for external event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.834 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.834 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.835 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.835 182096 DEBUG nova.virt.libvirt.vif [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-642417068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-642417068',id=110,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d9bdf487901467e93e87ea077f02990',ramdisk_id='',reservation_id='r-01ki4000',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-709013695',owner_user_name='tempest-ServerTagsTestJSON-709013695-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:53Z,user_data=None,user_id='3fcfaa4d06934f93b9116c73f6f5e006',uuid=de422406-f05b-4543-9135-d46371948cc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.835 182096 DEBUG nova.network.os_vif_util [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converting VIF {"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.836 182096 DEBUG nova.network.os_vif_util [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.836 182096 DEBUG os_vif [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.837 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.837 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.837 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.839 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.839 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4318352b-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.840 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4318352b-3d, col_values=(('external_ids', {'iface-id': '4318352b-3d26-4823-bf79-e962072b9e7d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:75:a2', 'vm-uuid': 'de422406-f05b-4543-9135-d46371948cc8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.841 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:56 compute-0 NetworkManager[54920]: <info>  [1769160596.8417] manager: (tap4318352b-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.843 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.845 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.846 182096 INFO os_vif [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d')
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.880 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.880 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.880 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] No VIF found with MAC fa:16:3e:52:75:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:29:56 compute-0 nova_compute[182092]: 2026-01-23 09:29:56.881 182096 INFO nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Using config drive
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.158 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.316 182096 INFO nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Creating config drive at /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.config
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.321 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6onzygtl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.438 182096 DEBUG oslo_concurrency.processutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6onzygtl" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:29:57 compute-0 kernel: tap4318352b-3d: entered promiscuous mode
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.4727] manager: (tap4318352b-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 23 09:29:57 compute-0 ovn_controller[94697]: 2026-01-23T09:29:57Z|00412|binding|INFO|Claiming lport 4318352b-3d26-4823-bf79-e962072b9e7d for this chassis.
Jan 23 09:29:57 compute-0 ovn_controller[94697]: 2026-01-23T09:29:57Z|00413|binding|INFO|4318352b-3d26-4823-bf79-e962072b9e7d: Claiming fa:16:3e:52:75:a2 10.100.0.10
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.480 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.485 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:75:a2 10.100.0.10'], port_security=['fa:16:3e:52:75:a2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'de422406-f05b-4543-9135-d46371948cc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-986514cb-1f26-4ae9-beac-032573b94c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d9bdf487901467e93e87ea077f02990', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3ea0c52d-483c-4406-86d3-46551ccf0178', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590800-2532-48ae-aab2-c7fb60083f61, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4318352b-3d26-4823-bf79-e962072b9e7d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.485 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4318352b-3d26-4823-bf79-e962072b9e7d in datapath 986514cb-1f26-4ae9-beac-032573b94c42 bound to our chassis
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.487 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 986514cb-1f26-4ae9-beac-032573b94c42
Jan 23 09:29:57 compute-0 systemd-udevd[223374]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.494 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[85919b52-e807-40fc-a07a-0f412a653fde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.495 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap986514cb-11 in ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.496 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap986514cb-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.496 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd92ad17-8d22-4028-9b01-8a1323a0963f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.497 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e59a0543-b7f7-45a3-99ff-53ce550ef4cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.505 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2057b8-e3cf-470c-b4e5-e6757d38cc51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.5100] device (tap4318352b-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.5107] device (tap4318352b-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:29:57 compute-0 systemd-machined[153562]: New machine qemu-55-instance-0000006e.
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.528 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e582a939-daca-49f9-9d2b-e503df289d1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-0000006e.
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.539 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 ovn_controller[94697]: 2026-01-23T09:29:57Z|00414|binding|INFO|Setting lport 4318352b-3d26-4823-bf79-e962072b9e7d ovn-installed in OVS
Jan 23 09:29:57 compute-0 ovn_controller[94697]: 2026-01-23T09:29:57Z|00415|binding|INFO|Setting lport 4318352b-3d26-4823-bf79-e962072b9e7d up in Southbound
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.544 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.551 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[26567473-f13c-4771-bc3b-8c2c92942b99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.5547] manager: (tap986514cb-10): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.555 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8392c647-c104-4a70-b5e9-4464a9764f7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.577 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f61002ec-9b0a-4e32-a899-068bdbe03be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.580 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f12fd782-98a1-47b1-9b35-6bb8ac349a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.5950] device (tap986514cb-10): carrier: link connected
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.597 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[27835982-2e0f-494a-b3f6-86fb5b93dce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.609 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbad89d-bb11-46da-b7df-3fe5e2a07e03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap986514cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:f4:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410308, 'reachable_time': 37001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223401, 'error': None, 'target': 'ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.618 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34df00ee-dc6e-4b5d-85d0-81dcfb759b11]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:f478'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 410308, 'tstamp': 410308}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223402, 'error': None, 'target': 'ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.627 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[919b8596-7a3d-4070-aa26-c201bf05a689]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap986514cb-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:f4:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410308, 'reachable_time': 37001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223403, 'error': None, 'target': 'ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.647 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[13adf37c-5b75-4446-bdac-fb2898da03b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.680 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8ec4d3-3725-4736-afe7-3294472ca30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.681 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap986514cb-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.681 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.682 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap986514cb-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:57 compute-0 kernel: tap986514cb-10: entered promiscuous mode
Jan 23 09:29:57 compute-0 NetworkManager[54920]: <info>  [1769160597.6846] manager: (tap986514cb-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.691 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap986514cb-10, col_values=(('external_ids', {'iface-id': '0fe01952-02c0-40be-83cf-ecb40a6a403f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:29:57 compute-0 ovn_controller[94697]: 2026-01-23T09:29:57Z|00416|binding|INFO|Releasing lport 0fe01952-02c0-40be-83cf-ecb40a6a403f from this chassis (sb_readonly=0)
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.707 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.708 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/986514cb-1f26-4ae9-beac-032573b94c42.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/986514cb-1f26-4ae9-beac-032573b94c42.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.708 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2d28d4cf-6992-480f-a1f3-82732ec28f23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.709 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-986514cb-1f26-4ae9-beac-032573b94c42
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/986514cb-1f26-4ae9-beac-032573b94c42.pid.haproxy
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 986514cb-1f26-4ae9-beac-032573b94c42
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:29:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:29:57.710 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42', 'env', 'PROCESS_TAG=haproxy-986514cb-1f26-4ae9-beac-032573b94c42', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/986514cb-1f26-4ae9-beac-032573b94c42.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.942 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160597.9424949, de422406-f05b-4543-9135-d46371948cc8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.945 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] VM Started (Lifecycle Event)
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.980 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.983 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160597.9431531, de422406-f05b-4543-9135-d46371948cc8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:57 compute-0 nova_compute[182092]: 2026-01-23 09:29:57.983 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] VM Paused (Lifecycle Event)
Jan 23 09:29:57 compute-0 podman[223439]: 2026-01-23 09:29:57.995718322 +0000 UTC m=+0.031195543 container create e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.002 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.005 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:29:58 compute-0 systemd[1]: Started libpod-conmon-e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a.scope.
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.025 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:29:58 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:29:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5570e09a1857a0ab5cd3d489f0b64156d798c2820ee26fae304e8a6ab07dc62/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:29:58 compute-0 podman[223439]: 2026-01-23 09:29:58.06109953 +0000 UTC m=+0.096576761 container init e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:29:58 compute-0 podman[223439]: 2026-01-23 09:29:58.065735949 +0000 UTC m=+0.101213169 container start e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:29:58 compute-0 podman[223439]: 2026-01-23 09:29:57.981966525 +0000 UTC m=+0.017443756 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:29:58 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [NOTICE]   (223455) : New worker (223457) forked
Jan 23 09:29:58 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [NOTICE]   (223455) : Loading success.
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.286 182096 DEBUG nova.network.neutron [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Updated VIF entry in instance network info cache for port 4318352b-3d26-4823-bf79-e962072b9e7d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.287 182096 DEBUG nova.network.neutron [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Updating instance_info_cache with network_info: [{"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:29:58 compute-0 nova_compute[182092]: 2026-01-23 09:29:58.302 182096 DEBUG oslo_concurrency.lockutils [req-fa1469b4-74e7-4bb3-9336-a0f97b1bb2eb req-cd5d1863-fd40-4eca-9c1c-a6e808ce78b2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-de422406-f05b-4543-9135-d46371948cc8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.797 182096 DEBUG nova.compute.manager [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.797 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.798 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.798 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.798 182096 DEBUG nova.compute.manager [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Processing event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.798 182096 DEBUG nova.compute.manager [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.798 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.799 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.799 182096 DEBUG oslo_concurrency.lockutils [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.799 182096 DEBUG nova.compute.manager [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] No waiting events found dispatching network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.799 182096 WARNING nova.compute.manager [req-c4bd08f8-34f0-468e-8263-c6419c3153b5 req-aeb4082e-eaee-48df-836a-5a5bbe96e094 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received unexpected event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d for instance with vm_state building and task_state spawning.
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.800 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.802 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160599.8022616, de422406-f05b-4543-9135-d46371948cc8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.802 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] VM Resumed (Lifecycle Event)
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.804 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.805 182096 INFO nova.virt.libvirt.driver [-] [instance: de422406-f05b-4543-9135-d46371948cc8] Instance spawned successfully.
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.806 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.821 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.824 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.824 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.824 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.825 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.825 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.825 182096 DEBUG nova.virt.libvirt.driver [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.828 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.858 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.879 182096 INFO nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Took 6.01 seconds to spawn the instance on the hypervisor.
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.880 182096 DEBUG nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.941 182096 INFO nova.compute.manager [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Took 6.41 seconds to build instance.
Jan 23 09:29:59 compute-0 nova_compute[182092]: 2026-01-23 09:29:59.968 182096 DEBUG oslo_concurrency.lockutils [None req-4a553c3f-92df-4d5f-9a15-e4bdd2723a9c 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:00 compute-0 podman[223463]: 2026-01-23 09:30:00.224244565 +0000 UTC m=+0.062848314 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 09:30:01 compute-0 nova_compute[182092]: 2026-01-23 09:30:01.843 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:02 compute-0 nova_compute[182092]: 2026-01-23 09:30:02.159 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:02 compute-0 nova_compute[182092]: 2026-01-23 09:30:02.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:02 compute-0 nova_compute[182092]: 2026-01-23 09:30:02.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.638 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.638 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.639 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.639 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.639 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.645 182096 INFO nova.compute.manager [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Terminating instance
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.651 182096 DEBUG nova.compute.manager [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:30:03 compute-0 kernel: tap4318352b-3d (unregistering): left promiscuous mode
Jan 23 09:30:03 compute-0 NetworkManager[54920]: <info>  [1769160603.6685] device (tap4318352b-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 ovn_controller[94697]: 2026-01-23T09:30:03Z|00417|binding|INFO|Releasing lport 4318352b-3d26-4823-bf79-e962072b9e7d from this chassis (sb_readonly=0)
Jan 23 09:30:03 compute-0 ovn_controller[94697]: 2026-01-23T09:30:03Z|00418|binding|INFO|Setting lport 4318352b-3d26-4823-bf79-e962072b9e7d down in Southbound
Jan 23 09:30:03 compute-0 ovn_controller[94697]: 2026-01-23T09:30:03Z|00419|binding|INFO|Removing iface tap4318352b-3d ovn-installed in OVS
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.681 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.683 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:75:a2 10.100.0.10'], port_security=['fa:16:3e:52:75:a2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'de422406-f05b-4543-9135-d46371948cc8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-986514cb-1f26-4ae9-beac-032573b94c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d9bdf487901467e93e87ea077f02990', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3ea0c52d-483c-4406-86d3-46551ccf0178', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c590800-2532-48ae-aab2-c7fb60083f61, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4318352b-3d26-4823-bf79-e962072b9e7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.685 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4318352b-3d26-4823-bf79-e962072b9e7d in datapath 986514cb-1f26-4ae9-beac-032573b94c42 unbound from our chassis
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.687 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 986514cb-1f26-4ae9-beac-032573b94c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.690 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[baebd889-fa36-4bbc-be44-f43d9ca71452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.690 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42 namespace which is not needed anymore
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 23 09:30:03 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006e.scope: Consumed 4.295s CPU time.
Jan 23 09:30:03 compute-0 systemd-machined[153562]: Machine qemu-55-instance-0000006e terminated.
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [NOTICE]   (223455) : haproxy version is 2.8.14-c23fe91
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [NOTICE]   (223455) : path to executable is /usr/sbin/haproxy
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [WARNING]  (223455) : Exiting Master process...
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [WARNING]  (223455) : Exiting Master process...
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [ALERT]    (223455) : Current worker (223457) exited with code 143 (Terminated)
Jan 23 09:30:03 compute-0 neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42[223451]: [WARNING]  (223455) : All workers exited. Exiting... (0)
Jan 23 09:30:03 compute-0 systemd[1]: libpod-e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a.scope: Deactivated successfully.
Jan 23 09:30:03 compute-0 podman[223508]: 2026-01-23 09:30:03.798708557 +0000 UTC m=+0.034478942 container died e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:30:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a-userdata-shm.mount: Deactivated successfully.
Jan 23 09:30:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-e5570e09a1857a0ab5cd3d489f0b64156d798c2820ee26fae304e8a6ab07dc62-merged.mount: Deactivated successfully.
Jan 23 09:30:03 compute-0 podman[223508]: 2026-01-23 09:30:03.818276761 +0000 UTC m=+0.054047136 container cleanup e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:30:03 compute-0 systemd[1]: libpod-conmon-e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a.scope: Deactivated successfully.
Jan 23 09:30:03 compute-0 podman[223532]: 2026-01-23 09:30:03.858711633 +0000 UTC m=+0.025646357 container remove e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.863 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa71b78-57f1-451f-834e-e2a9ac92b360]: (4, ('Fri Jan 23 09:30:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42 (e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a)\ne98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a\nFri Jan 23 09:30:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42 (e98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a)\ne98e7cb2cfa061289329fed3cf0e6611266a4e8711b72f058fc8df4f141a5e6a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c11a642-657e-40ff-b30c-70c73d476b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.867 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap986514cb-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.868 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 kernel: tap986514cb-10: left promiscuous mode
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.881 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.887 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[40ed670b-7fb9-45da-a242-50b82eaeb19b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.895 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3db0ecea-ff1e-4039-b64a-04d528ab00e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.896 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6c81af85-cc87-427a-a294-8f11483c7aaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.909 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8057f001-3950-4a58-910e-d580fcea132d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 410303, 'reachable_time': 29198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223560, 'error': None, 'target': 'ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.910 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-986514cb-1f26-4ae9-beac-032573b94c42 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:30:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:03.911 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[deec800e-8db3-417a-bff7-0ca0d3040b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d986514cb\x2d1f26\x2d4ae9\x2dbeac\x2d032573b94c42.mount: Deactivated successfully.
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.912 182096 INFO nova.virt.libvirt.driver [-] [instance: de422406-f05b-4543-9135-d46371948cc8] Instance destroyed successfully.
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.912 182096 DEBUG nova.objects.instance [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lazy-loading 'resources' on Instance uuid de422406-f05b-4543-9135-d46371948cc8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.992 182096 DEBUG nova.compute.manager [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-unplugged-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.992 182096 DEBUG oslo_concurrency.lockutils [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.993 182096 DEBUG oslo_concurrency.lockutils [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.993 182096 DEBUG oslo_concurrency.lockutils [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.993 182096 DEBUG nova.compute.manager [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] No waiting events found dispatching network-vif-unplugged-4318352b-3d26-4823-bf79-e962072b9e7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:30:03 compute-0 nova_compute[182092]: 2026-01-23 09:30:03.993 182096 DEBUG nova.compute.manager [req-f9f26dd4-8a76-4786-995c-9fa9264d878a req-53cf90a6-004c-4152-9fa0-9f93ad511f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-unplugged-4318352b-3d26-4823-bf79-e962072b9e7d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.091 182096 DEBUG nova.virt.libvirt.vif [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:29:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-642417068',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-642417068',id=110,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:29:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d9bdf487901467e93e87ea077f02990',ramdisk_id='',reservation_id='r-01ki4000',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-709013695',owner_user_name='tempest-ServerTagsTestJSON-709013695-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:29:59Z,user_data=None,user_id='3fcfaa4d06934f93b9116c73f6f5e006',uuid=de422406-f05b-4543-9135-d46371948cc8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.091 182096 DEBUG nova.network.os_vif_util [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converting VIF {"id": "4318352b-3d26-4823-bf79-e962072b9e7d", "address": "fa:16:3e:52:75:a2", "network": {"id": "986514cb-1f26-4ae9-beac-032573b94c42", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-191224925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d9bdf487901467e93e87ea077f02990", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4318352b-3d", "ovs_interfaceid": "4318352b-3d26-4823-bf79-e962072b9e7d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.092 182096 DEBUG nova.network.os_vif_util [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.092 182096 DEBUG os_vif [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.093 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.093 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4318352b-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.095 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.097 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.099 182096 INFO os_vif [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:75:a2,bridge_name='br-int',has_traffic_filtering=True,id=4318352b-3d26-4823-bf79-e962072b9e7d,network=Network(986514cb-1f26-4ae9-beac-032573b94c42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4318352b-3d')
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.100 182096 INFO nova.virt.libvirt.driver [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Deleting instance files /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8_del
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.100 182096 INFO nova.virt.libvirt.driver [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Deletion of /var/lib/nova/instances/de422406-f05b-4543-9135-d46371948cc8_del complete
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.228 182096 INFO nova.compute.manager [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Took 0.58 seconds to destroy the instance on the hypervisor.
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.229 182096 DEBUG oslo.service.loopingcall [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.229 182096 DEBUG nova.compute.manager [-] [instance: de422406-f05b-4543-9135-d46371948cc8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.229 182096 DEBUG nova.network.neutron [-] [instance: de422406-f05b-4543-9135-d46371948cc8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.760 182096 DEBUG nova.network.neutron [-] [instance: de422406-f05b-4543-9135-d46371948cc8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.772 182096 INFO nova.compute.manager [-] [instance: de422406-f05b-4543-9135-d46371948cc8] Took 0.54 seconds to deallocate network for instance.
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.825 182096 DEBUG nova.compute.manager [req-3e448403-4da3-492e-861d-bd7599a911e7 req-d25ccc0b-a9a8-483a-8da9-40ccfa1064b7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-deleted-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.829 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.829 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.867 182096 DEBUG nova.compute.provider_tree [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.876 182096 DEBUG nova.scheduler.client.report [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.888 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.903 182096 INFO nova.scheduler.client.report [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Deleted allocations for instance de422406-f05b-4543-9135-d46371948cc8
Jan 23 09:30:04 compute-0 nova_compute[182092]: 2026-01-23 09:30:04.970 182096 DEBUG oslo_concurrency.lockutils [None req-2735a654-f22d-43be-a252-28e3efec700b 3fcfaa4d06934f93b9116c73f6f5e006 5d9bdf487901467e93e87ea077f02990 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.667 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.887 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.888 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.26361846923828GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.889 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.889 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.947 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:30:05 compute-0 nova_compute[182092]: 2026-01-23 09:30:05.948 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.047 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.061 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.079 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.079 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.174 182096 DEBUG nova.compute.manager [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.174 182096 DEBUG oslo_concurrency.lockutils [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "de422406-f05b-4543-9135-d46371948cc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.174 182096 DEBUG oslo_concurrency.lockutils [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.174 182096 DEBUG oslo_concurrency.lockutils [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "de422406-f05b-4543-9135-d46371948cc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.175 182096 DEBUG nova.compute.manager [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] No waiting events found dispatching network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:30:06 compute-0 nova_compute[182092]: 2026-01-23 09:30:06.175 182096 WARNING nova.compute.manager [req-84b0c2cb-5bcb-47cd-89b5-94d559755af5 req-372ef6e3-9061-47a1-a44c-fb17987192f4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: de422406-f05b-4543-9135-d46371948cc8] Received unexpected event network-vif-plugged-4318352b-3d26-4823-bf79-e962072b9e7d for instance with vm_state deleted and task_state None.
Jan 23 09:30:07 compute-0 nova_compute[182092]: 2026-01-23 09:30:07.161 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:07 compute-0 podman[223565]: 2026-01-23 09:30:07.221682723 +0000 UTC m=+0.058083745 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 23 09:30:07 compute-0 podman[223566]: 2026-01-23 09:30:07.233284412 +0000 UTC m=+0.068825565 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:30:07 compute-0 nova_compute[182092]: 2026-01-23 09:30:07.684 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.080 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.080 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.080 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.121 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:08 compute-0 nova_compute[182092]: 2026-01-23 09:30:08.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:30:09 compute-0 nova_compute[182092]: 2026-01-23 09:30:09.096 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:09.251 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:30:09 compute-0 nova_compute[182092]: 2026-01-23 09:30:09.251 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:09.252 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:30:09 compute-0 nova_compute[182092]: 2026-01-23 09:30:09.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:10 compute-0 nova_compute[182092]: 2026-01-23 09:30:10.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:30:12 compute-0 nova_compute[182092]: 2026-01-23 09:30:12.163 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:12 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:12.253 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.262 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "9e32f2ae-16f5-4322-9269-a8d794507b04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.262 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.277 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.381 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.381 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.387 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.388 182096 INFO nova.compute.claims [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.480 182096 DEBUG nova.compute.provider_tree [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.491 182096 DEBUG nova.scheduler.client.report [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.543 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.545 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.597 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.642 182096 INFO nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.665 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.776 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.778 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.778 182096 INFO nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating image(s)
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.779 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.779 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.780 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.792 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.847 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.848 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.848 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.858 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.914 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.915 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.947 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.948 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:13 compute-0 nova_compute[182092]: 2026-01-23 09:30:13.949 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.001 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.003 182096 DEBUG nova.virt.disk.api [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Checking if we can resize image /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.003 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.054 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.056 182096 DEBUG nova.virt.disk.api [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Cannot resize image /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.056 182096 DEBUG nova.objects.instance [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.072 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.073 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Ensure instance console log exists: /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.073 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.074 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.074 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.077 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.082 182096 WARNING nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.087 182096 DEBUG nova.virt.libvirt.host [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.087 182096 DEBUG nova.virt.libvirt.host [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.091 182096 DEBUG nova.virt.libvirt.host [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.091 182096 DEBUG nova.virt.libvirt.host [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.092 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.093 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.093 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.093 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.093 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.094 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.094 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.094 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.094 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.094 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.095 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.095 182096 DEBUG nova.virt.hardware [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.098 182096 DEBUG nova.objects.instance [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.100 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.114 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <uuid>9e32f2ae-16f5-4322-9269-a8d794507b04</uuid>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <name>instance-0000006f</name>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV254Test-server-643668904</nova:name>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:30:14</nova:creationTime>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:user uuid="6790476f858443308104f74531089647">tempest-ServerShowV254Test-457303037-project-member</nova:user>
Jan 23 09:30:14 compute-0 nova_compute[182092]:         <nova:project uuid="2765302652e64e52b800936b8f1c4640">tempest-ServerShowV254Test-457303037</nova:project>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <system>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="serial">9e32f2ae-16f5-4322-9269-a8d794507b04</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="uuid">9e32f2ae-16f5-4322-9269-a8d794507b04</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </system>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <os>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </os>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <features>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </features>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/console.log" append="off"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <video>
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </video>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:30:14 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:30:14 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:30:14 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:30:14 compute-0 nova_compute[182092]: </domain>
Jan 23 09:30:14 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.146 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.146 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.147 182096 INFO nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Using config drive
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.297 182096 INFO nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating config drive at /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.304 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexpaabee execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.425 182096 DEBUG oslo_concurrency.processutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexpaabee" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:14 compute-0 systemd-machined[153562]: New machine qemu-56-instance-0000006f.
Jan 23 09:30:14 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-0000006f.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.847 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160614.8468492, 9e32f2ae-16f5-4322-9269-a8d794507b04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.847 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] VM Resumed (Lifecycle Event)
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.850 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.851 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.853 182096 INFO nova.virt.libvirt.driver [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance spawned successfully.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.854 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.867 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.871 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.874 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.875 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.875 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.875 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.876 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.876 182096 DEBUG nova.virt.libvirt.driver [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.892 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.893 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160614.847705, 9e32f2ae-16f5-4322-9269-a8d794507b04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.893 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] VM Started (Lifecycle Event)
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.934 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.936 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.955 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.959 182096 INFO nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Took 1.18 seconds to spawn the instance on the hypervisor.
Jan 23 09:30:14 compute-0 nova_compute[182092]: 2026-01-23 09:30:14.960 182096 DEBUG nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:15 compute-0 nova_compute[182092]: 2026-01-23 09:30:15.005 182096 INFO nova.compute.manager [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Took 1.67 seconds to build instance.
Jan 23 09:30:15 compute-0 nova_compute[182092]: 2026-01-23 09:30:15.020 182096 DEBUG oslo_concurrency.lockutils [None req-51272dcb-aa46-410b-a994-8634c81ae30f 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 1.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.185 182096 INFO nova.compute.manager [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Rebuilding instance
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.412 182096 DEBUG nova.compute.manager [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.476 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.484 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.495 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'resources' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.502 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.521 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:30:16 compute-0 nova_compute[182092]: 2026-01-23 09:30:16.523 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:30:17 compute-0 nova_compute[182092]: 2026-01-23 09:30:17.165 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:18 compute-0 nova_compute[182092]: 2026-01-23 09:30:18.909 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160603.9092178, de422406-f05b-4543-9135-d46371948cc8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:18 compute-0 nova_compute[182092]: 2026-01-23 09:30:18.910 182096 INFO nova.compute.manager [-] [instance: de422406-f05b-4543-9135-d46371948cc8] VM Stopped (Lifecycle Event)
Jan 23 09:30:18 compute-0 nova_compute[182092]: 2026-01-23 09:30:18.938 182096 DEBUG nova.compute.manager [None req-446daddf-0a30-4542-ad81-207cf1001cef - - - - - -] [instance: de422406-f05b-4543-9135-d46371948cc8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:19 compute-0 nova_compute[182092]: 2026-01-23 09:30:19.104 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:19 compute-0 podman[223646]: 2026-01-23 09:30:19.209350531 +0000 UTC m=+0.040987853 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:30:19 compute-0 podman[223647]: 2026-01-23 09:30:19.243263216 +0000 UTC m=+0.074086207 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:30:21 compute-0 podman[223682]: 2026-01-23 09:30:21.206227562 +0000 UTC m=+0.044373758 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:30:22 compute-0 nova_compute[182092]: 2026-01-23 09:30:22.168 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:24 compute-0 nova_compute[182092]: 2026-01-23 09:30:24.105 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:26 compute-0 nova_compute[182092]: 2026-01-23 09:30:26.555 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:30:27 compute-0 nova_compute[182092]: 2026-01-23 09:30:27.168 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:28 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 23 09:30:28 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000006f.scope: Consumed 10.636s CPU time.
Jan 23 09:30:28 compute-0 systemd-machined[153562]: Machine qemu-56-instance-0000006f terminated.
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.107 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.566 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance shutdown successfully after 13 seconds.
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.570 182096 INFO nova.virt.libvirt.driver [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance destroyed successfully.
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.573 182096 INFO nova.virt.libvirt.driver [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance destroyed successfully.
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.573 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Deleting instance files /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04_del
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.573 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Deletion of /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04_del complete
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.726 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.727 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating image(s)
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.727 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.728 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.728 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.739 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.786 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.787 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.787 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.796 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.843 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.844 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.865 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.866 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.866 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.912 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.913 182096 DEBUG nova.virt.disk.api [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Checking if we can resize image /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.913 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.960 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.961 182096 DEBUG nova.virt.disk.api [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Cannot resize image /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.961 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.961 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Ensure instance console log exists: /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.962 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.962 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.962 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.964 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.967 182096 WARNING nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.971 182096 DEBUG nova.virt.libvirt.host [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.972 182096 DEBUG nova.virt.libvirt.host [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.975 182096 DEBUG nova.virt.libvirt.host [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.976 182096 DEBUG nova.virt.libvirt.host [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.977 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.977 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.977 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.978 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.978 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.978 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.978 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.979 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.979 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.979 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.979 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.980 182096 DEBUG nova.virt.hardware [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.980 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:29 compute-0 nova_compute[182092]: 2026-01-23 09:30:29.991 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <uuid>9e32f2ae-16f5-4322-9269-a8d794507b04</uuid>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <name>instance-0000006f</name>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV254Test-server-643668904</nova:name>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:30:29</nova:creationTime>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:user uuid="6790476f858443308104f74531089647">tempest-ServerShowV254Test-457303037-project-member</nova:user>
Jan 23 09:30:29 compute-0 nova_compute[182092]:         <nova:project uuid="2765302652e64e52b800936b8f1c4640">tempest-ServerShowV254Test-457303037</nova:project>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="11cc348c-4b05-42ba-a4b9-513b91dede76"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <system>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="serial">9e32f2ae-16f5-4322-9269-a8d794507b04</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="uuid">9e32f2ae-16f5-4322-9269-a8d794507b04</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </system>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <os>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </os>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <features>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </features>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/console.log" append="off"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <video>
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </video>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:30:29 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:30:29 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:30:29 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:30:29 compute-0 nova_compute[182092]: </domain>
Jan 23 09:30:29 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.026 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.026 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.027 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Using config drive
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.036 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.240 182096 INFO nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Creating config drive at /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.245 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxv969xc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.364 182096 DEBUG oslo_concurrency.processutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxv969xc" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:30 compute-0 systemd-machined[153562]: New machine qemu-57-instance-0000006f.
Jan 23 09:30:30 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-0000006f.
Jan 23 09:30:30 compute-0 podman[223746]: 2026-01-23 09:30:30.486362191 +0000 UTC m=+0.075959784 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.750 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 9e32f2ae-16f5-4322-9269-a8d794507b04 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.751 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160630.7499783, 9e32f2ae-16f5-4322-9269-a8d794507b04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.752 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] VM Resumed (Lifecycle Event)
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.754 182096 DEBUG nova.compute.manager [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.755 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.757 182096 INFO nova.virt.libvirt.driver [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance spawned successfully.
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.758 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.778 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.780 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.789 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.789 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.789 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.790 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.790 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.791 182096 DEBUG nova.virt.libvirt.driver [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.808 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.809 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160630.7502134, 9e32f2ae-16f5-4322-9269-a8d794507b04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.809 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] VM Started (Lifecycle Event)
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.830 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.832 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.854 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.875 182096 DEBUG nova.compute.manager [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.927 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.928 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:30 compute-0 nova_compute[182092]: 2026-01-23 09:30:30.928 182096 DEBUG nova.objects.instance [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.030 182096 DEBUG oslo_concurrency.lockutils [None req-556e5f2d-61c8-44c1-97db-509afce00b0c 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.659 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "9e32f2ae-16f5-4322-9269-a8d794507b04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.660 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.660 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "9e32f2ae-16f5-4322-9269-a8d794507b04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.661 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.661 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.668 182096 INFO nova.compute.manager [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Terminating instance
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.674 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "refresh_cache-9e32f2ae-16f5-4322-9269-a8d794507b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.674 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquired lock "refresh_cache-9e32f2ae-16f5-4322-9269-a8d794507b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.674 182096 DEBUG nova.network.neutron [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:30:31 compute-0 nova_compute[182092]: 2026-01-23 09:30:31.809 182096 DEBUG nova.network.neutron [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.169 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.344 182096 DEBUG nova.network.neutron [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.362 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Releasing lock "refresh_cache-9e32f2ae-16f5-4322-9269-a8d794507b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.363 182096 DEBUG nova.compute.manager [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:30:32 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 23 09:30:32 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000006f.scope: Consumed 1.924s CPU time.
Jan 23 09:30:32 compute-0 systemd-machined[153562]: Machine qemu-57-instance-0000006f terminated.
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.594 182096 INFO nova.virt.libvirt.driver [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance destroyed successfully.
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.594 182096 DEBUG nova.objects.instance [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lazy-loading 'resources' on Instance uuid 9e32f2ae-16f5-4322-9269-a8d794507b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.602 182096 INFO nova.virt.libvirt.driver [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Deleting instance files /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04_del
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.603 182096 INFO nova.virt.libvirt.driver [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Deletion of /var/lib/nova/instances/9e32f2ae-16f5-4322-9269-a8d794507b04_del complete
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.644 182096 INFO nova.compute.manager [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Took 0.28 seconds to destroy the instance on the hypervisor.
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.645 182096 DEBUG oslo.service.loopingcall [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.645 182096 DEBUG nova.compute.manager [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:30:32 compute-0 nova_compute[182092]: 2026-01-23 09:30:32.645 182096 DEBUG nova.network.neutron [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:32.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:30:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.135 182096 DEBUG nova.network.neutron [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.153 182096 DEBUG nova.network.neutron [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.164 182096 INFO nova.compute.manager [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Took 0.52 seconds to deallocate network for instance.
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.220 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.220 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.272 182096 DEBUG nova.compute.provider_tree [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.280 182096 DEBUG nova.scheduler.client.report [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.295 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.313 182096 INFO nova.scheduler.client.report [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Deleted allocations for instance 9e32f2ae-16f5-4322-9269-a8d794507b04
Jan 23 09:30:33 compute-0 nova_compute[182092]: 2026-01-23 09:30:33.370 182096 DEBUG oslo_concurrency.lockutils [None req-2f68eaf3-d140-477b-a9a4-87d9496461fb 6790476f858443308104f74531089647 2765302652e64e52b800936b8f1c4640 - - default default] Lock "9e32f2ae-16f5-4322-9269-a8d794507b04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:34 compute-0 nova_compute[182092]: 2026-01-23 09:30:34.109 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:37 compute-0 nova_compute[182092]: 2026-01-23 09:30:37.170 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:38 compute-0 podman[223796]: 2026-01-23 09:30:38.206806055 +0000 UTC m=+0.041497674 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:30:38 compute-0 podman[223795]: 2026-01-23 09:30:38.213046853 +0000 UTC m=+0.049603873 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.875 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.875 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.894 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.993 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.993 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.998 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:30:38 compute-0 nova_compute[182092]: 2026-01-23 09:30:38.999 182096 INFO nova.compute.claims [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.099 182096 DEBUG nova.compute.provider_tree [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.110 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.126 182096 DEBUG nova.scheduler.client.report [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.157 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.157 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.204 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.216 182096 INFO nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.228 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.302 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.302 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.303 182096 INFO nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating image(s)
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.303 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.303 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.304 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.314 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.361 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.362 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.362 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.371 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.425 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.426 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.448 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.449 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.449 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.495 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.496 182096 DEBUG nova.virt.disk.api [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Checking if we can resize image /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.496 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.543 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.544 182096 DEBUG nova.virt.disk.api [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Cannot resize image /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.544 182096 DEBUG nova.objects.instance [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'migration_context' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.556 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.557 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Ensure instance console log exists: /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.557 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.557 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.558 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.559 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.562 182096 WARNING nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.566 182096 DEBUG nova.virt.libvirt.host [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.567 182096 DEBUG nova.virt.libvirt.host [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.569 182096 DEBUG nova.virt.libvirt.host [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.570 182096 DEBUG nova.virt.libvirt.host [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.571 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.571 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.571 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.572 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.572 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.572 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.572 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.572 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.573 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.573 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.573 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.573 182096 DEBUG nova.virt.hardware [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.576 182096 DEBUG nova.objects.instance [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'pci_devices' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.585 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <uuid>b106bf75-2404-4c74-b1cf-66fe46b64f92</uuid>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <name>instance-00000072</name>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV257Test-server-1240058141</nova:name>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:30:39</nova:creationTime>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:user uuid="9ecdcc9910e04638a754ffa949986fda">tempest-ServerShowV257Test-511580540-project-member</nova:user>
Jan 23 09:30:39 compute-0 nova_compute[182092]:         <nova:project uuid="640cc1b30f2b44f695cda6aca7e0533c">tempest-ServerShowV257Test-511580540</nova:project>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="serial">b106bf75-2404-4c74-b1cf-66fe46b64f92</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="uuid">b106bf75-2404-4c74-b1cf-66fe46b64f92</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/console.log" append="off"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:30:39 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:30:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:30:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:30:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:30:39 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.619 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.620 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:39 compute-0 nova_compute[182092]: 2026-01-23 09:30:39.621 182096 INFO nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Using config drive
Jan 23 09:30:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:39.864 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:39.864 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.263 182096 INFO nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating config drive at /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.267 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkwd01wav execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.387 182096 DEBUG oslo_concurrency.processutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkwd01wav" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:40 compute-0 systemd-machined[153562]: New machine qemu-58-instance-00000072.
Jan 23 09:30:40 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-00000072.
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.881 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160640.880923, b106bf75-2404-4c74-b1cf-66fe46b64f92 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.882 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] VM Resumed (Lifecycle Event)
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.884 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.885 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.887 182096 INFO nova.virt.libvirt.driver [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance spawned successfully.
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.887 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.903 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.906 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.906 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.907 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.907 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.907 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.907 182096 DEBUG nova.virt.libvirt.driver [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.911 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.949 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.950 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160640.8841348, b106bf75-2404-4c74-b1cf-66fe46b64f92 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.951 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] VM Started (Lifecycle Event)
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.969 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.971 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.980 182096 INFO nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Took 1.68 seconds to spawn the instance on the hypervisor.
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.980 182096 DEBUG nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:40 compute-0 nova_compute[182092]: 2026-01-23 09:30:40.988 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:41 compute-0 nova_compute[182092]: 2026-01-23 09:30:41.038 182096 INFO nova.compute.manager [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Took 2.08 seconds to build instance.
Jan 23 09:30:41 compute-0 nova_compute[182092]: 2026-01-23 09:30:41.051 182096 DEBUG oslo_concurrency.lockutils [None req-90758d1e-03b9-4f7d-9a0a-705c25297b12 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.171 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.534 182096 INFO nova.compute.manager [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Rebuilding instance
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.745 182096 DEBUG nova.compute.manager [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.788 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'pci_requests' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.795 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'pci_devices' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.801 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'resources' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.808 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'migration_context' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.813 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:30:42 compute-0 nova_compute[182092]: 2026-01-23 09:30:42.815 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:30:44 compute-0 nova_compute[182092]: 2026-01-23 09:30:44.112 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:47 compute-0 nova_compute[182092]: 2026-01-23 09:30:47.172 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:47 compute-0 nova_compute[182092]: 2026-01-23 09:30:47.593 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160632.5925915, 9e32f2ae-16f5-4322-9269-a8d794507b04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:47 compute-0 nova_compute[182092]: 2026-01-23 09:30:47.594 182096 INFO nova.compute.manager [-] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] VM Stopped (Lifecycle Event)
Jan 23 09:30:47 compute-0 nova_compute[182092]: 2026-01-23 09:30:47.614 182096 DEBUG nova.compute.manager [None req-195ea6cc-a34e-4a40-97db-943cee31408b - - - - - -] [instance: 9e32f2ae-16f5-4322-9269-a8d794507b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.114 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.870 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.870 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.884 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.977 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.978 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.983 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:30:49 compute-0 nova_compute[182092]: 2026-01-23 09:30:49.983 182096 INFO nova.compute.claims [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.112 182096 DEBUG nova.compute.provider_tree [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.154 182096 DEBUG nova.scheduler.client.report [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.181 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.181 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:30:50 compute-0 podman[223887]: 2026-01-23 09:30:50.217375708 +0000 UTC m=+0.046646653 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 09:30:50 compute-0 podman[223888]: 2026-01-23 09:30:50.240245768 +0000 UTC m=+0.068282886 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.254 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.254 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.305 182096 INFO nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.327 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.427 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.428 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.428 182096 INFO nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Creating image(s)
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.429 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.429 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.430 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.440 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.495 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.496 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.497 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.506 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.560 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.561 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.583 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.583 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.584 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.639 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.640 182096 DEBUG nova.virt.disk.api [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Checking if we can resize image /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.640 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.653 182096 DEBUG nova.policy [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f5124daa7654208bae14567ce2b792a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e78d7d9ad2184912a9a629d8089ee896', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.687 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.687 182096 DEBUG nova.virt.disk.api [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Cannot resize image /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.687 182096 DEBUG nova.objects.instance [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lazy-loading 'migration_context' on Instance uuid b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.698 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.699 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Ensure instance console log exists: /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.699 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.699 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:50 compute-0 nova_compute[182092]: 2026-01-23 09:30:50.699 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:51 compute-0 nova_compute[182092]: 2026-01-23 09:30:51.261 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Successfully created port: f92a90cf-14e3-462e-892b-e20e25f31f12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.174 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:52 compute-0 podman[223940]: 2026-01-23 09:30:52.206241991 +0000 UTC m=+0.044457473 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.351 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Successfully updated port: f92a90cf-14e3-462e-892b-e20e25f31f12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.365 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.365 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquired lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.365 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.433 182096 DEBUG nova.compute.manager [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-changed-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.434 182096 DEBUG nova.compute.manager [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Refreshing instance network info cache due to event network-changed-f92a90cf-14e3-462e-892b-e20e25f31f12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.434 182096 DEBUG oslo_concurrency.lockutils [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.476 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:30:52 compute-0 nova_compute[182092]: 2026-01-23 09:30:52.844 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:30:53 compute-0 ovn_controller[94697]: 2026-01-23T09:30:53Z|00420|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.608 182096 DEBUG nova.network.neutron [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Updating instance_info_cache with network_info: [{"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.628 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Releasing lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.628 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Instance network_info: |[{"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.629 182096 DEBUG oslo_concurrency.lockutils [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.629 182096 DEBUG nova.network.neutron [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Refreshing network info cache for port f92a90cf-14e3-462e-892b-e20e25f31f12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.631 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Start _get_guest_xml network_info=[{"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.634 182096 WARNING nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.637 182096 DEBUG nova.virt.libvirt.host [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.638 182096 DEBUG nova.virt.libvirt.host [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.640 182096 DEBUG nova.virt.libvirt.host [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.640 182096 DEBUG nova.virt.libvirt.host [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.641 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.641 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.642 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.642 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.642 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.642 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.643 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.643 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.643 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.643 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.643 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.644 182096 DEBUG nova.virt.hardware [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.646 182096 DEBUG nova.virt.libvirt.vif [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:30:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1930853264',display_name='tempest-ListServersNegativeTestJSON-server-1930853264-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1930853264-2',id=116,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e78d7d9ad2184912a9a629d8089ee896',ramdisk_id='',reservation_id='r-deszdhdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-889454144',owner_user_name='te
mpest-ListServersNegativeTestJSON-889454144-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:30:50Z,user_data=None,user_id='3f5124daa7654208bae14567ce2b792a',uuid=b1142f56-f6c2-43e9-8bca-7c8a7179c5b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.647 182096 DEBUG nova.network.os_vif_util [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converting VIF {"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.647 182096 DEBUG nova.network.os_vif_util [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.648 182096 DEBUG nova.objects.instance [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lazy-loading 'pci_devices' on Instance uuid b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.667 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <uuid>b1142f56-f6c2-43e9-8bca-7c8a7179c5b6</uuid>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <name>instance-00000074</name>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:name>tempest-ListServersNegativeTestJSON-server-1930853264-2</nova:name>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:30:53</nova:creationTime>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:user uuid="3f5124daa7654208bae14567ce2b792a">tempest-ListServersNegativeTestJSON-889454144-project-member</nova:user>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:project uuid="e78d7d9ad2184912a9a629d8089ee896">tempest-ListServersNegativeTestJSON-889454144</nova:project>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         <nova:port uuid="f92a90cf-14e3-462e-892b-e20e25f31f12">
Jan 23 09:30:53 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <system>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="serial">b1142f56-f6c2-43e9-8bca-7c8a7179c5b6</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="uuid">b1142f56-f6c2-43e9-8bca-7c8a7179c5b6</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </system>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <os>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </os>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <features>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </features>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.config"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:87:16:73"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <target dev="tapf92a90cf-14"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/console.log" append="off"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <video>
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </video>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:30:53 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:30:53 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:30:53 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:30:53 compute-0 nova_compute[182092]: </domain>
Jan 23 09:30:53 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.668 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Preparing to wait for external event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.668 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.668 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.668 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.669 182096 DEBUG nova.virt.libvirt.vif [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:30:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1930853264',display_name='tempest-ListServersNegativeTestJSON-server-1930853264-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1930853264-2',id=116,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e78d7d9ad2184912a9a629d8089ee896',ramdisk_id='',reservation_id='r-deszdhdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-889454144',owner_use
r_name='tempest-ListServersNegativeTestJSON-889454144-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:30:50Z,user_data=None,user_id='3f5124daa7654208bae14567ce2b792a',uuid=b1142f56-f6c2-43e9-8bca-7c8a7179c5b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.669 182096 DEBUG nova.network.os_vif_util [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converting VIF {"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.669 182096 DEBUG nova.network.os_vif_util [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.670 182096 DEBUG os_vif [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.670 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.671 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.674 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.675 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf92a90cf-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.675 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf92a90cf-14, col_values=(('external_ids', {'iface-id': 'f92a90cf-14e3-462e-892b-e20e25f31f12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:16:73', 'vm-uuid': 'b1142f56-f6c2-43e9-8bca-7c8a7179c5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.677 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:53 compute-0 NetworkManager[54920]: <info>  [1769160653.6787] manager: (tapf92a90cf-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.679 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.683 182096 INFO os_vif [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14')
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.729 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.729 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.729 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] No VIF found with MAC fa:16:3e:87:16:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:30:53 compute-0 nova_compute[182092]: 2026-01-23 09:30:53.730 182096 INFO nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Using config drive
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.399 182096 INFO nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Creating config drive at /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.config
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.403 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmyo5ysnk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.522 182096 DEBUG oslo_concurrency.processutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmyo5ysnk" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:54 compute-0 kernel: tapf92a90cf-14: entered promiscuous mode
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.5602] manager: (tapf92a90cf-14): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_controller[94697]: 2026-01-23T09:30:54Z|00421|binding|INFO|Claiming lport f92a90cf-14e3-462e-892b-e20e25f31f12 for this chassis.
Jan 23 09:30:54 compute-0 ovn_controller[94697]: 2026-01-23T09:30:54Z|00422|binding|INFO|f92a90cf-14e3-462e-892b-e20e25f31f12: Claiming fa:16:3e:87:16:73 10.100.0.7
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.567 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.573 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:16:73 10.100.0.7'], port_security=['fa:16:3e:87:16:73 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b1142f56-f6c2-43e9-8bca-7c8a7179c5b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e78d7d9ad2184912a9a629d8089ee896', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92f26eae-6213-4d28-8f33-de0d4bc28e12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=debac405-cc19-4740-ab1b-49720fad733b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=f92a90cf-14e3-462e-892b-e20e25f31f12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.574 103978 INFO neutron.agent.ovn.metadata.agent [-] Port f92a90cf-14e3-462e-892b-e20e25f31f12 in datapath 75c40f67-aae2-433e-97bf-0a10e0fbe68b bound to our chassis
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.575 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75c40f67-aae2-433e-97bf-0a10e0fbe68b
Jan 23 09:30:54 compute-0 systemd-udevd[223978]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.585 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[581f7f89-f4e0-47a7-a5e3-2b49b4f44e4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.586 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75c40f67-a1 in ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.587 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75c40f67-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.587 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e11adf5e-ccab-49d1-bd17-ed12394109d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.588 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[03d6e691-5e3e-4048-b309-d572144c369e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.596 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8803d8-d76f-45d7-a5ff-769e047fe3ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.6054] device (tapf92a90cf-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.6058] device (tapf92a90cf-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:30:54 compute-0 systemd-machined[153562]: New machine qemu-59-instance-00000074.
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6314f0ae-fcad-42ae-8859-014e1853f87d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.623 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-00000074.
Jan 23 09:30:54 compute-0 ovn_controller[94697]: 2026-01-23T09:30:54Z|00423|binding|INFO|Setting lport f92a90cf-14e3-462e-892b-e20e25f31f12 ovn-installed in OVS
Jan 23 09:30:54 compute-0 ovn_controller[94697]: 2026-01-23T09:30:54Z|00424|binding|INFO|Setting lport f92a90cf-14e3-462e-892b-e20e25f31f12 up in Southbound
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.627 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.645 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b68245-3ef3-4cc2-9af7-df4ac4c27898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.6497] manager: (tap75c40f67-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/217)
Jan 23 09:30:54 compute-0 systemd-udevd[223983]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.649 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[86a8eff9-b9bb-41bf-a33e-3147d6eff9f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.676 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[dbad09ca-2054-49ea-a590-6c60cbe23af6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.679 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3dd23a-f753-426c-848c-18eadd7028c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.6960] device (tap75c40f67-a0): carrier: link connected
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.700 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ac3d80-ec45-4c1a-8d5e-8416a181c58b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.712 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b36d79-53b7-4096-9604-fbdce4b0c836]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75c40f67-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:b8:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416018, 'reachable_time': 30614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224003, 'error': None, 'target': 'ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.723 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[98d4a13e-7bf2-474a-84e7-777ee4ca02e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:b80e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416018, 'tstamp': 416018}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224004, 'error': None, 'target': 'ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.736 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6db8af4b-19a4-4519-b167-8f243a526012]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75c40f67-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:b8:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416018, 'reachable_time': 30614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224005, 'error': None, 'target': 'ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.755 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9d64b5f0-f287-48d4-b77a-4fdb3b71ebaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.792 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2dfeb383-d180-46c6-b422-73a3d3cb6599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.793 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75c40f67-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.793 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.794 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75c40f67-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.795 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 kernel: tap75c40f67-a0: entered promiscuous mode
Jan 23 09:30:54 compute-0 NetworkManager[54920]: <info>  [1769160654.7963] manager: (tap75c40f67-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.802 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75c40f67-a0, col_values=(('external_ids', {'iface-id': '0948d0d0-0fc7-4892-92d9-a00219e3c3ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.802 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_controller[94697]: 2026-01-23T09:30:54Z|00425|binding|INFO|Releasing lport 0948d0d0-0fc7-4892-92d9-a00219e3c3ea from this chassis (sb_readonly=0)
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.803 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.816 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75c40f67-aae2-433e-97bf-0a10e0fbe68b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75c40f67-aae2-433e-97bf-0a10e0fbe68b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:30:54 compute-0 nova_compute[182092]: 2026-01-23 09:30:54.815 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.816 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f144ccc2-cab4-4fb6-92bf-ecc5eeee8c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.817 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-75c40f67-aae2-433e-97bf-0a10e0fbe68b
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/75c40f67-aae2-433e-97bf-0a10e0fbe68b.pid.haproxy
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 75c40f67-aae2-433e-97bf-0a10e0fbe68b
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:30:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:30:54.817 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'env', 'PROCESS_TAG=haproxy-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75c40f67-aae2-433e-97bf-0a10e0fbe68b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:30:54 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 23 09:30:54 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000072.scope: Consumed 10.311s CPU time.
Jan 23 09:30:54 compute-0 systemd-machined[153562]: Machine qemu-58-instance-00000072 terminated.
Jan 23 09:30:55 compute-0 podman[224034]: 2026-01-23 09:30:55.106158267 +0000 UTC m=+0.030852110 container create ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:30:55 compute-0 systemd[1]: Started libpod-conmon-ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322.scope.
Jan 23 09:30:55 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:30:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2053aa6cd570baad7a28eb960d44a21960402f7d6e68e159685445c654d387d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:30:55 compute-0 podman[224034]: 2026-01-23 09:30:55.165576319 +0000 UTC m=+0.090270163 container init ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:30:55 compute-0 podman[224034]: 2026-01-23 09:30:55.169885398 +0000 UTC m=+0.094579241 container start ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 09:30:55 compute-0 podman[224034]: 2026-01-23 09:30:55.091789541 +0000 UTC m=+0.016483404 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:30:55 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [NOTICE]   (224055) : New worker (224060) forked
Jan 23 09:30:55 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [NOTICE]   (224055) : Loading success.
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.478 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160655.4782004, b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.479 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] VM Started (Lifecycle Event)
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.511 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.514 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160655.4783616, b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.514 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] VM Paused (Lifecycle Event)
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.537 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.539 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.555 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.725 182096 DEBUG nova.network.neutron [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Updated VIF entry in instance network info cache for port f92a90cf-14e3-462e-892b-e20e25f31f12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.726 182096 DEBUG nova.network.neutron [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Updating instance_info_cache with network_info: [{"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.741 182096 DEBUG oslo_concurrency.lockutils [req-1b0b6cc7-8d83-4ca3-bc37-732a86351987 req-bad5ab9a-af8c-426d-9b12-895319c641b3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.854 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance shutdown successfully after 13 seconds.
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.858 182096 INFO nova.virt.libvirt.driver [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance destroyed successfully.
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.861 182096 INFO nova.virt.libvirt.driver [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance destroyed successfully.
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.861 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Deleting instance files /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92_del
Jan 23 09:30:55 compute-0 nova_compute[182092]: 2026-01-23 09:30:55.862 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Deletion of /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92_del complete
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.037 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.038 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating image(s)
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.038 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.039 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.039 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.049 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.096 182096 DEBUG nova.compute.manager [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.096 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.097 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.097 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.097 182096 DEBUG nova.compute.manager [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Processing event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.097 182096 DEBUG nova.compute.manager [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.097 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.098 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.098 182096 DEBUG oslo_concurrency.lockutils [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.098 182096 DEBUG nova.compute.manager [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] No waiting events found dispatching network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.098 182096 WARNING nova.compute.manager [req-0b619220-684d-4a8c-9e19-232e7e2b4d40 req-a79f6d8e-c64e-4ce3-bd93-70712970d161 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received unexpected event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 for instance with vm_state building and task_state spawning.
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.099 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.100 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.102 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.102 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.112 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.124 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160656.1019416, b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.125 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] VM Resumed (Lifecycle Event)
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.127 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.130 182096 INFO nova.virt.libvirt.driver [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Instance spawned successfully.
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.130 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.151 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.153 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.159 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.159 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.175 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.175 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.176 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.176 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.177 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.177 182096 DEBUG nova.virt.libvirt.driver [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.181 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.182 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c,backing_fmt=raw /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.182 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "3ab4138fc997788c63c306bdd47c259649cf0f6c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.183 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.229 182096 INFO nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Took 5.80 seconds to spawn the instance on the hypervisor.
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.229 182096 DEBUG nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.231 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.231 182096 DEBUG nova.virt.disk.api [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Checking if we can resize image /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.232 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.290 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.294 182096 DEBUG nova.virt.disk.api [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Cannot resize image /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.295 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.295 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Ensure instance console log exists: /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.295 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.296 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.296 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.297 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.300 182096 WARNING nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.307 182096 DEBUG nova.virt.libvirt.host [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.307 182096 DEBUG nova.virt.libvirt.host [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.311 182096 DEBUG nova.virt.libvirt.host [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.312 182096 DEBUG nova.virt.libvirt.host [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.313 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.313 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:57Z,direct_url=<?>,disk_format='qcow2',id=11cc348c-4b05-42ba-a4b9-513b91dede76,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.314 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.314 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.314 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.314 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.315 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.315 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.315 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.315 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.316 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.316 182096 DEBUG nova.virt.hardware [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.316 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'vcpu_model' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.317 182096 INFO nova.compute.manager [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Took 6.38 seconds to build instance.
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.333 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <uuid>b106bf75-2404-4c74-b1cf-66fe46b64f92</uuid>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <name>instance-00000072</name>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerShowV257Test-server-1240058141</nova:name>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:30:56</nova:creationTime>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:user uuid="9ecdcc9910e04638a754ffa949986fda">tempest-ServerShowV257Test-511580540-project-member</nova:user>
Jan 23 09:30:56 compute-0 nova_compute[182092]:         <nova:project uuid="640cc1b30f2b44f695cda6aca7e0533c">tempest-ServerShowV257Test-511580540</nova:project>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="11cc348c-4b05-42ba-a4b9-513b91dede76"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <nova:ports/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <system>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="serial">b106bf75-2404-4c74-b1cf-66fe46b64f92</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="uuid">b106bf75-2404-4c74-b1cf-66fe46b64f92</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </system>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <os>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </os>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <features>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </features>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/console.log" append="off"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <video>
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </video>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:30:56 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:30:56 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:30:56 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:30:56 compute-0 nova_compute[182092]: </domain>
Jan 23 09:30:56 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.339 182096 DEBUG oslo_concurrency.lockutils [None req-13bf6da6-21b8-4651-9d24-5b7063909c4a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.369 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.369 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.370 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Using config drive
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.383 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'ec2_ids' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:56 compute-0 nova_compute[182092]: 2026-01-23 09:30:56.404 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'keypairs' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.174 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.245 182096 INFO nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Creating config drive at /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.250 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp17m2vsaw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.369 182096 DEBUG oslo_concurrency.processutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp17m2vsaw" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:30:57 compute-0 systemd-machined[153562]: New machine qemu-60-instance-00000072.
Jan 23 09:30:57 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-00000072.
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.728 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for b106bf75-2404-4c74-b1cf-66fe46b64f92 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.728 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160657.7279181, b106bf75-2404-4c74-b1cf-66fe46b64f92 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.728 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] VM Resumed (Lifecycle Event)
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.735 182096 DEBUG nova.compute.manager [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.735 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.738 182096 INFO nova.virt.libvirt.driver [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance spawned successfully.
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.738 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.748 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.753 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.756 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.756 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.756 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.757 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.757 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.757 182096 DEBUG nova.virt.libvirt.driver [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.772 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.772 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160657.734719, b106bf75-2404-4c74-b1cf-66fe46b64f92 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.772 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] VM Started (Lifecycle Event)
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.790 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.792 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.806 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.812 182096 DEBUG nova.compute.manager [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.880 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.880 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.881 182096 DEBUG nova.objects.instance [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 09:30:57 compute-0 nova_compute[182092]: 2026-01-23 09:30:57.953 182096 DEBUG oslo_concurrency.lockutils [None req-fc834632-96ea-400f-8268-ef0b31959f98 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:58 compute-0 nova_compute[182092]: 2026-01-23 09:30:58.677 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.405 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.405 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.405 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "b106bf75-2404-4c74-b1cf-66fe46b64f92-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.405 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.405 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.413 182096 INFO nova.compute.manager [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Terminating instance
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.418 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "refresh_cache-b106bf75-2404-4c74-b1cf-66fe46b64f92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.418 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquired lock "refresh_cache-b106bf75-2404-4c74-b1cf-66fe46b64f92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.419 182096 DEBUG nova.network.neutron [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:30:59 compute-0 nova_compute[182092]: 2026-01-23 09:30:59.521 182096 DEBUG nova.network.neutron [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.344 182096 DEBUG nova.network.neutron [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.361 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Releasing lock "refresh_cache-b106bf75-2404-4c74-b1cf-66fe46b64f92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.361 182096 DEBUG nova.compute.manager [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:31:00 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 23 09:31:00 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000072.scope: Consumed 2.920s CPU time.
Jan 23 09:31:00 compute-0 systemd-machined[153562]: Machine qemu-60-instance-00000072 terminated.
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.593 182096 INFO nova.virt.libvirt.driver [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance destroyed successfully.
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.594 182096 DEBUG nova.objects.instance [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lazy-loading 'resources' on Instance uuid b106bf75-2404-4c74-b1cf-66fe46b64f92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.604 182096 INFO nova.virt.libvirt.driver [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Deleting instance files /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92_del
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.604 182096 INFO nova.virt.libvirt.driver [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Deletion of /var/lib/nova/instances/b106bf75-2404-4c74-b1cf-66fe46b64f92_del complete
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.673 182096 INFO nova.compute.manager [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.674 182096 DEBUG oslo.service.loopingcall [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.675 182096 DEBUG nova.compute.manager [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.675 182096 DEBUG nova.network.neutron [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.784 182096 DEBUG nova.network.neutron [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.792 182096 DEBUG nova.network.neutron [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.800 182096 INFO nova.compute.manager [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Took 0.13 seconds to deallocate network for instance.
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.870 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.871 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.972 182096 DEBUG nova.compute.provider_tree [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:00 compute-0 nova_compute[182092]: 2026-01-23 09:31:00.992 182096 DEBUG nova.scheduler.client.report [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:01 compute-0 nova_compute[182092]: 2026-01-23 09:31:01.014 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:01 compute-0 nova_compute[182092]: 2026-01-23 09:31:01.049 182096 INFO nova.scheduler.client.report [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Deleted allocations for instance b106bf75-2404-4c74-b1cf-66fe46b64f92
Jan 23 09:31:01 compute-0 nova_compute[182092]: 2026-01-23 09:31:01.107 182096 DEBUG oslo_concurrency.lockutils [None req-d1e1e76b-0b87-4344-9862-20b414540d6e 9ecdcc9910e04638a754ffa949986fda 640cc1b30f2b44f695cda6aca7e0533c - - default default] Lock "b106bf75-2404-4c74-b1cf-66fe46b64f92" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:01 compute-0 podman[224124]: 2026-01-23 09:31:01.233341755 +0000 UTC m=+0.067010125 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:31:02 compute-0 nova_compute[182092]: 2026-01-23 09:31:02.175 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:02 compute-0 nova_compute[182092]: 2026-01-23 09:31:02.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:03 compute-0 nova_compute[182092]: 2026-01-23 09:31:03.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.789 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.790 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.790 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.790 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.790 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.797 182096 INFO nova.compute.manager [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Terminating instance
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.802 182096 DEBUG nova.compute.manager [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:31:04 compute-0 kernel: tapf92a90cf-14 (unregistering): left promiscuous mode
Jan 23 09:31:04 compute-0 NetworkManager[54920]: <info>  [1769160664.8204] device (tapf92a90cf-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.826 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:04 compute-0 ovn_controller[94697]: 2026-01-23T09:31:04Z|00426|binding|INFO|Releasing lport f92a90cf-14e3-462e-892b-e20e25f31f12 from this chassis (sb_readonly=0)
Jan 23 09:31:04 compute-0 ovn_controller[94697]: 2026-01-23T09:31:04Z|00427|binding|INFO|Setting lport f92a90cf-14e3-462e-892b-e20e25f31f12 down in Southbound
Jan 23 09:31:04 compute-0 ovn_controller[94697]: 2026-01-23T09:31:04Z|00428|binding|INFO|Removing iface tapf92a90cf-14 ovn-installed in OVS
Jan 23 09:31:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:04.832 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:16:73 10.100.0.7'], port_security=['fa:16:3e:87:16:73 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b1142f56-f6c2-43e9-8bca-7c8a7179c5b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e78d7d9ad2184912a9a629d8089ee896', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92f26eae-6213-4d28-8f33-de0d4bc28e12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=debac405-cc19-4740-ab1b-49720fad733b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=f92a90cf-14e3-462e-892b-e20e25f31f12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:31:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:04.834 103978 INFO neutron.agent.ovn.metadata.agent [-] Port f92a90cf-14e3-462e-892b-e20e25f31f12 in datapath 75c40f67-aae2-433e-97bf-0a10e0fbe68b unbound from our chassis
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.835 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:04.837 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75c40f67-aae2-433e-97bf-0a10e0fbe68b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:31:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:04.839 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae04466-5d10-4ab3-8813-07f2ff2c35f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:04.839 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b namespace which is not needed anymore
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.844 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:04 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 23 09:31:04 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000074.scope: Consumed 9.575s CPU time.
Jan 23 09:31:04 compute-0 systemd-machined[153562]: Machine qemu-59-instance-00000074 terminated.
Jan 23 09:31:04 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [NOTICE]   (224055) : haproxy version is 2.8.14-c23fe91
Jan 23 09:31:04 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [NOTICE]   (224055) : path to executable is /usr/sbin/haproxy
Jan 23 09:31:04 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [WARNING]  (224055) : Exiting Master process...
Jan 23 09:31:04 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [ALERT]    (224055) : Current worker (224060) exited with code 143 (Terminated)
Jan 23 09:31:04 compute-0 neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b[224046]: [WARNING]  (224055) : All workers exited. Exiting... (0)
Jan 23 09:31:04 compute-0 systemd[1]: libpod-ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322.scope: Deactivated successfully.
Jan 23 09:31:04 compute-0 podman[224171]: 2026-01-23 09:31:04.937618678 +0000 UTC m=+0.034911196 container died ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:31:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322-userdata-shm.mount: Deactivated successfully.
Jan 23 09:31:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-2053aa6cd570baad7a28eb960d44a21960402f7d6e68e159685445c654d387d8-merged.mount: Deactivated successfully.
Jan 23 09:31:04 compute-0 podman[224171]: 2026-01-23 09:31:04.959214686 +0000 UTC m=+0.056507204 container cleanup ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:31:04 compute-0 systemd[1]: libpod-conmon-ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322.scope: Deactivated successfully.
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.986 182096 DEBUG nova.compute.manager [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-unplugged-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.986 182096 DEBUG oslo_concurrency.lockutils [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.986 182096 DEBUG oslo_concurrency.lockutils [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.986 182096 DEBUG oslo_concurrency.lockutils [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.987 182096 DEBUG nova.compute.manager [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] No waiting events found dispatching network-vif-unplugged-f92a90cf-14e3-462e-892b-e20e25f31f12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:04 compute-0 nova_compute[182092]: 2026-01-23 09:31:04.987 182096 DEBUG nova.compute.manager [req-47f29036-1bb0-425d-90b8-2ead79c56eb0 req-98029451-978c-47be-8185-3ec16ff1cdfe 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-unplugged-f92a90cf-14e3-462e-892b-e20e25f31f12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:31:04 compute-0 podman[224194]: 2026-01-23 09:31:04.998420953 +0000 UTC m=+0.024350355 container remove ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.002 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2889ad-1311-47ca-b4d6-25c6b8b8c57b]: (4, ('Fri Jan 23 09:31:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b (ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322)\nca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322\nFri Jan 23 09:31:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b (ca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322)\nca58b0f1062aafb1f895514f8fe330678f8c9d024b6f3172f6dfd333995a9322\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.003 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50fe0b76-9f84-473f-924b-25d3421b7378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.004 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75c40f67-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:05 compute-0 kernel: tap75c40f67-a0: left promiscuous mode
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.005 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:05 compute-0 NetworkManager[54920]: <info>  [1769160665.0165] manager: (tapf92a90cf-14): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.023 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.024 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb78c3a-77ed-4a84-b0b9-494b7e81c83a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.034 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[106839f4-8a33-4f2f-a8bd-eb30d55374c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.034 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8d30a864-125b-4b4e-a484-a82940ad95b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.046 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[eefd1a90-de1d-4aa9-bff7-bf2dcab9f999]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416012, 'reachable_time': 39942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224214, 'error': None, 'target': 'ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.047 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75c40f67-aae2-433e-97bf-0a10e0fbe68b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:31:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:05.047 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[d79aba6c-2569-4d6b-85f4-1742e30494d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d75c40f67\x2daae2\x2d433e\x2d97bf\x2d0a10e0fbe68b.mount: Deactivated successfully.
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.051 182096 INFO nova.virt.libvirt.driver [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Instance destroyed successfully.
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.052 182096 DEBUG nova.objects.instance [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lazy-loading 'resources' on Instance uuid b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.064 182096 DEBUG nova.virt.libvirt.vif [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:30:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1930853264',display_name='tempest-ListServersNegativeTestJSON-server-1930853264-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1930853264-2',id=116,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T09:30:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e78d7d9ad2184912a9a629d8089ee896',ramdisk_id='',reservation_id='r-deszdhdd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-889454144',owner_user_name='tempest-ListServersNegativeTestJSON-889454144-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:30:56Z,user_data=None,user_id='3f5124daa7654208bae14567ce2b792a',uuid=b1142f56-f6c2-43e9-8bca-7c8a7179c5b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.065 182096 DEBUG nova.network.os_vif_util [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converting VIF {"id": "f92a90cf-14e3-462e-892b-e20e25f31f12", "address": "fa:16:3e:87:16:73", "network": {"id": "75c40f67-aae2-433e-97bf-0a10e0fbe68b", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-24765961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e78d7d9ad2184912a9a629d8089ee896", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf92a90cf-14", "ovs_interfaceid": "f92a90cf-14e3-462e-892b-e20e25f31f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.065 182096 DEBUG nova.network.os_vif_util [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.066 182096 DEBUG os_vif [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.166 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.166 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf92a90cf-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.168 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.171 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.172 182096 INFO os_vif [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:16:73,bridge_name='br-int',has_traffic_filtering=True,id=f92a90cf-14e3-462e-892b-e20e25f31f12,network=Network(75c40f67-aae2-433e-97bf-0a10e0fbe68b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf92a90cf-14')
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.173 182096 INFO nova.virt.libvirt.driver [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Deleting instance files /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6_del
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.173 182096 INFO nova.virt.libvirt.driver [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Deletion of /var/lib/nova/instances/b1142f56-f6c2-43e9-8bca-7c8a7179c5b6_del complete
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.227 182096 INFO nova.compute.manager [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.228 182096 DEBUG oslo.service.loopingcall [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.229 182096 DEBUG nova.compute.manager [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.229 182096 DEBUG nova.network.neutron [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.683 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.684 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.684 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.684 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.760 182096 DEBUG nova.network.neutron [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.784 182096 INFO nova.compute.manager [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Took 0.55 seconds to deallocate network for instance.
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.835 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.835 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.846 182096 DEBUG nova.compute.manager [req-c5c0f18d-c8c6-4b07-bfa0-629071924f3a req-024aad9b-727e-4980-bdbc-17d2436a313f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-deleted-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.872 182096 DEBUG nova.compute.provider_tree [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.882 182096 DEBUG nova.scheduler.client.report [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.895 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.913 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.915 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.26358032226562GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.915 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.915 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.929 182096 INFO nova.scheduler.client.report [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Deleted allocations for instance b1142f56-f6c2-43e9-8bca-7c8a7179c5b6
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.977 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.977 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.988 182096 DEBUG oslo_concurrency.lockutils [None req-a7383408-e1ea-4947-8053-0906c827a23a 3f5124daa7654208bae14567ce2b792a e78d7d9ad2184912a9a629d8089ee896 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:05 compute-0 nova_compute[182092]: 2026-01-23 09:31:05.994 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:06 compute-0 nova_compute[182092]: 2026-01-23 09:31:06.007 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:06 compute-0 nova_compute[182092]: 2026-01-23 09:31:06.025 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:31:06 compute-0 nova_compute[182092]: 2026-01-23 09:31:06.025 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.096 182096 DEBUG nova.compute.manager [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.097 182096 DEBUG oslo_concurrency.lockutils [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.097 182096 DEBUG oslo_concurrency.lockutils [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.097 182096 DEBUG oslo_concurrency.lockutils [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "b1142f56-f6c2-43e9-8bca-7c8a7179c5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.097 182096 DEBUG nova.compute.manager [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] No waiting events found dispatching network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.098 182096 WARNING nova.compute.manager [req-0faba8aa-f1d5-4075-ad00-27dcf82f80e3 req-940a752a-b0fe-4c8b-a389-66ae6cb3ee33 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Received unexpected event network-vif-plugged-f92a90cf-14e3-462e-892b-e20e25f31f12 for instance with vm_state deleted and task_state None.
Jan 23 09:31:07 compute-0 nova_compute[182092]: 2026-01-23 09:31:07.177 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.025 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.026 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.026 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.044 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.046 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:08 compute-0 nova_compute[182092]: 2026-01-23 09:31:08.667 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:09 compute-0 podman[224219]: 2026-01-23 09:31:09.204926937 +0000 UTC m=+0.039493830 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 23 09:31:09 compute-0 podman[224220]: 2026-01-23 09:31:09.206224314 +0000 UTC m=+0.040317103 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:31:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:09.669 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:31:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:09.670 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:31:09 compute-0 nova_compute[182092]: 2026-01-23 09:31:09.671 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:10 compute-0 nova_compute[182092]: 2026-01-23 09:31:10.117 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:10 compute-0 nova_compute[182092]: 2026-01-23 09:31:10.167 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:10 compute-0 nova_compute[182092]: 2026-01-23 09:31:10.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:10 compute-0 nova_compute[182092]: 2026-01-23 09:31:10.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:31:11 compute-0 nova_compute[182092]: 2026-01-23 09:31:11.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:31:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:11.672 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:12 compute-0 nova_compute[182092]: 2026-01-23 09:31:12.178 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.348 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.348 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.370 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.466 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.466 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.473 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.473 182096 INFO nova.compute.claims [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.600 182096 DEBUG nova.compute.provider_tree [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.620 182096 DEBUG nova.scheduler.client.report [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.634 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.634 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.680 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.680 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.695 182096 INFO nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.712 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.782 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.783 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.783 182096 INFO nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Creating image(s)
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.783 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.784 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.784 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.795 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.842 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.842 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.843 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.852 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.900 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.900 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.921 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.923 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.923 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.968 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.969 182096 DEBUG nova.virt.disk.api [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Checking if we can resize image /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:31:14 compute-0 nova_compute[182092]: 2026-01-23 09:31:14.969 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.013 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.014 182096 DEBUG nova.virt.disk.api [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Cannot resize image /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.014 182096 DEBUG nova.objects.instance [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.029 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.030 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Ensure instance console log exists: /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.030 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.030 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.030 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.168 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.305 182096 DEBUG nova.policy [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c664df671f084e15bedf8f948ca3d38c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1b83842511b438480c657f4b89702d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.592 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160660.591089, b106bf75-2404-4c74-b1cf-66fe46b64f92 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.592 182096 INFO nova.compute.manager [-] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] VM Stopped (Lifecycle Event)
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.607 182096 DEBUG nova.compute.manager [None req-89ba0883-4450-405b-96cc-1e8cf5d692fd - - - - - -] [instance: b106bf75-2404-4c74-b1cf-66fe46b64f92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:15 compute-0 nova_compute[182092]: 2026-01-23 09:31:15.907 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Successfully created port: c4101a62-8d2e-4ff2-b17d-95110c75a4bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.505 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Successfully updated port: c4101a62-8d2e-4ff2-b17d-95110c75a4bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.515 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.515 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquired lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.515 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.591 182096 DEBUG nova.compute.manager [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-changed-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.591 182096 DEBUG nova.compute.manager [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Refreshing instance network info cache due to event network-changed-c4101a62-8d2e-4ff2-b17d-95110c75a4bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.591 182096 DEBUG oslo_concurrency.lockutils [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:16 compute-0 nova_compute[182092]: 2026-01-23 09:31:16.707 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:31:17 compute-0 nova_compute[182092]: 2026-01-23 09:31:17.179 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:17 compute-0 nova_compute[182092]: 2026-01-23 09:31:17.753 182096 DEBUG nova.network.neutron [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updating instance_info_cache with network_info: [{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.691 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Releasing lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.691 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance network_info: |[{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.692 182096 DEBUG oslo_concurrency.lockutils [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.692 182096 DEBUG nova.network.neutron [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Refreshing network info cache for port c4101a62-8d2e-4ff2-b17d-95110c75a4bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.694 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Start _get_guest_xml network_info=[{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.697 182096 WARNING nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.703 182096 DEBUG nova.virt.libvirt.host [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.704 182096 DEBUG nova.virt.libvirt.host [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.708 182096 DEBUG nova.virt.libvirt.host [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.708 182096 DEBUG nova.virt.libvirt.host [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.709 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.709 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.710 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.711 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.711 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.711 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.711 182096 DEBUG nova.virt.hardware [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.714 182096 DEBUG nova.virt.libvirt.vif [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:31:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1468833441',display_name='tempest-ServersNegativeTestJSON-server-1468833441',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1468833441',id=120,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1b83842511b438480c657f4b89702d0',ramdisk_id='',reservation_id='r-6eeyckqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-977727523',owner_user_name='tempest-ServersNegativ
eTestJSON-977727523-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:31:14Z,user_data=None,user_id='c664df671f084e15bedf8f948ca3d38c',uuid=6eb3a9e9-805d-4468-bfd1-03aa390682f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.714 182096 DEBUG nova.network.os_vif_util [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converting VIF {"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.715 182096 DEBUG nova.network.os_vif_util [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.715 182096 DEBUG nova.objects.instance [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.785 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <uuid>6eb3a9e9-805d-4468-bfd1-03aa390682f8</uuid>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <name>instance-00000078</name>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersNegativeTestJSON-server-1468833441</nova:name>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:31:18</nova:creationTime>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:user uuid="c664df671f084e15bedf8f948ca3d38c">tempest-ServersNegativeTestJSON-977727523-project-member</nova:user>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:project uuid="b1b83842511b438480c657f4b89702d0">tempest-ServersNegativeTestJSON-977727523</nova:project>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         <nova:port uuid="c4101a62-8d2e-4ff2-b17d-95110c75a4bd">
Jan 23 09:31:18 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <system>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="serial">6eb3a9e9-805d-4468-bfd1-03aa390682f8</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="uuid">6eb3a9e9-805d-4468-bfd1-03aa390682f8</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </system>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <os>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </os>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <features>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </features>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.config"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:4f:77:97"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <target dev="tapc4101a62-8d"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/console.log" append="off"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <video>
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </video>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:31:18 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:31:18 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:31:18 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:31:18 compute-0 nova_compute[182092]: </domain>
Jan 23 09:31:18 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.786 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Preparing to wait for external event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.786 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.786 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.787 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.787 182096 DEBUG nova.virt.libvirt.vif [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:31:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1468833441',display_name='tempest-ServersNegativeTestJSON-server-1468833441',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1468833441',id=120,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1b83842511b438480c657f4b89702d0',ramdisk_id='',reservation_id='r-6eeyckqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-977727523',owner_user_name='tempest-Serv
ersNegativeTestJSON-977727523-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:31:14Z,user_data=None,user_id='c664df671f084e15bedf8f948ca3d38c',uuid=6eb3a9e9-805d-4468-bfd1-03aa390682f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.787 182096 DEBUG nova.network.os_vif_util [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converting VIF {"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.788 182096 DEBUG nova.network.os_vif_util [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.788 182096 DEBUG os_vif [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.789 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.789 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.789 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.792 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.793 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4101a62-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.793 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4101a62-8d, col_values=(('external_ids', {'iface-id': 'c4101a62-8d2e-4ff2-b17d-95110c75a4bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:77:97', 'vm-uuid': '6eb3a9e9-805d-4468-bfd1-03aa390682f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.794 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:18 compute-0 NetworkManager[54920]: <info>  [1769160678.7952] manager: (tapc4101a62-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.798 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.799 182096 INFO os_vif [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d')
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.855 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.856 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.856 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] No VIF found with MAC fa:16:3e:4f:77:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:31:18 compute-0 nova_compute[182092]: 2026-01-23 09:31:18.856 182096 INFO nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Using config drive
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.047 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160665.0468612, b1142f56-f6c2-43e9-8bca-7c8a7179c5b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.047 182096 INFO nova.compute.manager [-] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] VM Stopped (Lifecycle Event)
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.099 182096 DEBUG nova.compute.manager [None req-441d353d-e7a9-4361-bf94-b43d52fdcf6b - - - - - -] [instance: b1142f56-f6c2-43e9-8bca-7c8a7179c5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.365 182096 INFO nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Creating config drive at /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.config
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.369 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7is55ef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.487 182096 DEBUG oslo_concurrency.processutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7is55ef" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:20 compute-0 kernel: tapc4101a62-8d: entered promiscuous mode
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.5339] manager: (tapc4101a62-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.536 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 ovn_controller[94697]: 2026-01-23T09:31:20Z|00429|binding|INFO|Claiming lport c4101a62-8d2e-4ff2-b17d-95110c75a4bd for this chassis.
Jan 23 09:31:20 compute-0 ovn_controller[94697]: 2026-01-23T09:31:20Z|00430|binding|INFO|c4101a62-8d2e-4ff2-b17d-95110c75a4bd: Claiming fa:16:3e:4f:77:97 10.100.0.5
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.541 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.548 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:77:97 10.100.0.5'], port_security=['fa:16:3e:4f:77:97 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6eb3a9e9-805d-4468-bfd1-03aa390682f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9e1d00-8438-4823-956e-6cae137c7678', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1b83842511b438480c657f4b89702d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '894695d7-c37a-4406-98cf-62b3145309c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=854f12a8-7cf0-4ea1-9bb7-22830c9849dd, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=c4101a62-8d2e-4ff2-b17d-95110c75a4bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.549 103978 INFO neutron.agent.ovn.metadata.agent [-] Port c4101a62-8d2e-4ff2-b17d-95110c75a4bd in datapath 3a9e1d00-8438-4823-956e-6cae137c7678 bound to our chassis
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.550 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a9e1d00-8438-4823-956e-6cae137c7678
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.561 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b8caeb7c-afef-4197-8d1f-0456a156a8ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.562 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a9e1d00-81 in ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.564 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a9e1d00-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.564 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[257aa50a-5cbb-40b5-9a69-42cdea99e3ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 systemd-udevd[224305]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.568 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b966ecc5-e8a3-4a22-8267-8965a2fbe91c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.580 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[edc23bd9-e3e6-4a7c-bea4-4119831ddbc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 systemd-machined[153562]: New machine qemu-61-instance-00000078.
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.5840] device (tapc4101a62-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.5847] device (tapc4101a62-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:31:20 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-00000078.
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8fadad-57f4-48e1-be1b-4a0b4fd7698f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_controller[94697]: 2026-01-23T09:31:20Z|00431|binding|INFO|Setting lport c4101a62-8d2e-4ff2-b17d-95110c75a4bd ovn-installed in OVS
Jan 23 09:31:20 compute-0 ovn_controller[94697]: 2026-01-23T09:31:20Z|00432|binding|INFO|Setting lport c4101a62-8d2e-4ff2-b17d-95110c75a4bd up in Southbound
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.610 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 podman[224284]: 2026-01-23 09:31:20.624411321 +0000 UTC m=+0.092332853 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:31:20 compute-0 podman[224283]: 2026-01-23 09:31:20.62712973 +0000 UTC m=+0.096741180 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.630 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[76acd5c4-2775-4afb-b052-bae11009811e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.635 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[804240d9-70a9-43af-b94a-1e4244b14e5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.6361] manager: (tap3a9e1d00-80): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Jan 23 09:31:20 compute-0 systemd-udevd[224313]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.660 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a541e10f-2f48-4e06-9cc5-246c7cea9b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.662 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[11417972-51ac-4d86-96e3-1a46f425b978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.6758] device (tap3a9e1d00-80): carrier: link connected
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.678 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e02b0e-a63d-474b-a694-13b0c8c16ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.690 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50be1045-be8c-49bb-911f-a6d7cc3271f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9e1d00-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:da:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418616, 'reachable_time': 39096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224353, 'error': None, 'target': 'ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.701 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b25e01b-0230-438b-b2d3-00c54c8577ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:dacb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418616, 'tstamp': 418616}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224354, 'error': None, 'target': 'ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.715 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[900fa446-3dda-4294-aa77-b467409e909f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a9e1d00-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:da:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418616, 'reachable_time': 39096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224356, 'error': None, 'target': 'ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.735 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[32cb1f39-9bb8-4d3b-8a82-fd8f59df9829]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.773 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dac0b890-a519-4bb6-b391-020d4a351391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.774 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9e1d00-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.774 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.775 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a9e1d00-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:20 compute-0 NetworkManager[54920]: <info>  [1769160680.7767] manager: (tap3a9e1d00-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 23 09:31:20 compute-0 kernel: tap3a9e1d00-80: entered promiscuous mode
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.778 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a9e1d00-80, col_values=(('external_ids', {'iface-id': '2a93a9c2-b647-41f7-a2b2-cd58e456413a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.778 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 ovn_controller[94697]: 2026-01-23T09:31:20Z|00433|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.780 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.780 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a9e1d00-8438-4823-956e-6cae137c7678.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a9e1d00-8438-4823-956e-6cae137c7678.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.781 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4342eaa-1c0d-4c87-ad74-9908f099ea58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.784 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-3a9e1d00-8438-4823-956e-6cae137c7678
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/3a9e1d00-8438-4823-956e-6cae137c7678.pid.haproxy
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 3a9e1d00-8438-4823-956e-6cae137c7678
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:31:20 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:20.785 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678', 'env', 'PROCESS_TAG=haproxy-3a9e1d00-8438-4823-956e-6cae137c7678', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a9e1d00-8438-4823-956e-6cae137c7678.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.788 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160680.787756, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.788 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Started (Lifecycle Event)
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.792 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.804 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.807 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160680.7879398, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.808 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Paused (Lifecycle Event)
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.836 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.838 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:31:20 compute-0 nova_compute[182092]: 2026-01-23 09:31:20.867 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:31:21 compute-0 podman[224391]: 2026-01-23 09:31:21.067893435 +0000 UTC m=+0.037105125 container create 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:31:21 compute-0 systemd[1]: Started libpod-conmon-0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c.scope.
Jan 23 09:31:21 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:31:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0157c71dae9db8b4776ec3374422eadf2a7870f710cdb0c2a51fc5dbe2826e2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:31:21 compute-0 podman[224391]: 2026-01-23 09:31:21.111287972 +0000 UTC m=+0.080499672 container init 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:31:21 compute-0 podman[224391]: 2026-01-23 09:31:21.115681721 +0000 UTC m=+0.084893411 container start 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:31:21 compute-0 podman[224391]: 2026-01-23 09:31:21.046550485 +0000 UTC m=+0.015762195 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:31:21 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [NOTICE]   (224407) : New worker (224409) forked
Jan 23 09:31:21 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [NOTICE]   (224407) : Loading success.
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.064 182096 DEBUG nova.compute.manager [req-4be65496-6de4-494a-961b-885c1ba67b67 req-f4c62659-f7f7-4fe7-8d9a-33d2a8cf8f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.065 182096 DEBUG oslo_concurrency.lockutils [req-4be65496-6de4-494a-961b-885c1ba67b67 req-f4c62659-f7f7-4fe7-8d9a-33d2a8cf8f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.065 182096 DEBUG oslo_concurrency.lockutils [req-4be65496-6de4-494a-961b-885c1ba67b67 req-f4c62659-f7f7-4fe7-8d9a-33d2a8cf8f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.066 182096 DEBUG oslo_concurrency.lockutils [req-4be65496-6de4-494a-961b-885c1ba67b67 req-f4c62659-f7f7-4fe7-8d9a-33d2a8cf8f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.066 182096 DEBUG nova.compute.manager [req-4be65496-6de4-494a-961b-885c1ba67b67 req-f4c62659-f7f7-4fe7-8d9a-33d2a8cf8f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Processing event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.066 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.069 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160682.0697064, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.070 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Resumed (Lifecycle Event)
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.071 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.073 182096 INFO nova.virt.libvirt.driver [-] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance spawned successfully.
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.073 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.134 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.135 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.135 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.135 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.136 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.136 182096 DEBUG nova.virt.libvirt.driver [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.139 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.141 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.181 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.192 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.227 182096 INFO nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Took 7.44 seconds to spawn the instance on the hypervisor.
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.227 182096 DEBUG nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.307 182096 DEBUG nova.network.neutron [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updated VIF entry in instance network info cache for port c4101a62-8d2e-4ff2-b17d-95110c75a4bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.308 182096 DEBUG nova.network.neutron [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updating instance_info_cache with network_info: [{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.315 182096 INFO nova.compute.manager [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Took 7.89 seconds to build instance.
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.333 182096 DEBUG oslo_concurrency.lockutils [req-41d21c88-5ba8-49ed-844f-a8fc46c9e60a req-c35f2eeb-5911-45c2-98f4-0a4653581909 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:22 compute-0 nova_compute[182092]: 2026-01-23 09:31:22.334 182096 DEBUG oslo_concurrency.lockutils [None req-8a4528bf-0fa1-4077-9510-bc0c21f387c5 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:23 compute-0 podman[224414]: 2026-01-23 09:31:23.21819398 +0000 UTC m=+0.054805132 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 23 09:31:23 compute-0 nova_compute[182092]: 2026-01-23 09:31:23.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.282 182096 DEBUG nova.compute.manager [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.283 182096 DEBUG oslo_concurrency.lockutils [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.283 182096 DEBUG oslo_concurrency.lockutils [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.283 182096 DEBUG oslo_concurrency.lockutils [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.283 182096 DEBUG nova.compute.manager [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] No waiting events found dispatching network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:24 compute-0 nova_compute[182092]: 2026-01-23 09:31:24.283 182096 WARNING nova.compute.manager [req-cc0655b0-7037-4167-bad9-4cb9ab91ff79 req-4afca186-6b71-467b-8ea8-520d2a7cf1da 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received unexpected event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd for instance with vm_state active and task_state None.
Jan 23 09:31:27 compute-0 nova_compute[182092]: 2026-01-23 09:31:27.182 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:28 compute-0 nova_compute[182092]: 2026-01-23 09:31:28.799 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:32 compute-0 ovn_controller[94697]: 2026-01-23T09:31:32Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:77:97 10.100.0.5
Jan 23 09:31:32 compute-0 ovn_controller[94697]: 2026-01-23T09:31:32Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:77:97 10.100.0.5
Jan 23 09:31:32 compute-0 nova_compute[182092]: 2026-01-23 09:31:32.183 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:32 compute-0 podman[224442]: 2026-01-23 09:31:32.222408083 +0000 UTC m=+0.061075762 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:31:33 compute-0 nova_compute[182092]: 2026-01-23 09:31:33.802 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.515 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.516 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.538 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.681 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.682 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.688 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.688 182096 INFO nova.compute.claims [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.784 182096 DEBUG nova.compute.provider_tree [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.795 182096 DEBUG nova.scheduler.client.report [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.808 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.809 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.854 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.855 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.867 182096 INFO nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.880 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.990 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.991 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.991 182096 INFO nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Creating image(s)
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.991 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.992 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:34 compute-0 nova_compute[182092]: 2026-01-23 09:31:34.992 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.003 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.049 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.051 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.051 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.060 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.106 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.107 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.128 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.129 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.129 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.175 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.176 182096 DEBUG nova.virt.disk.api [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Checking if we can resize image /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.176 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.222 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.222 182096 DEBUG nova.virt.disk.api [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Cannot resize image /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.223 182096 DEBUG nova.objects.instance [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lazy-loading 'migration_context' on Instance uuid 24d1d7fc-0955-4f04-bc30-e81259733fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.254 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.254 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Ensure instance console log exists: /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.255 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.255 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.255 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.320 182096 DEBUG nova.policy [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e63c2d78d6f14c07bb779029e80cdefc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fef146fc9ddc405e809e63a1908b742b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:31:35 compute-0 nova_compute[182092]: 2026-01-23 09:31:35.926 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Successfully created port: 2a71439a-3529-478d-8d96-d237fe2b6ba0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.619 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Successfully updated port: 2a71439a-3529-478d-8d96-d237fe2b6ba0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.630 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.630 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquired lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.630 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.734 182096 DEBUG nova.compute.manager [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-changed-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.735 182096 DEBUG nova.compute.manager [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Refreshing instance network info cache due to event network-changed-2a71439a-3529-478d-8d96-d237fe2b6ba0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:31:36 compute-0 nova_compute[182092]: 2026-01-23 09:31:36.735 182096 DEBUG oslo_concurrency.lockutils [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.094 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.185 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.777 182096 DEBUG nova.network.neutron [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updating instance_info_cache with network_info: [{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.791 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Releasing lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.792 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance network_info: |[{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.792 182096 DEBUG oslo_concurrency.lockutils [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.792 182096 DEBUG nova.network.neutron [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Refreshing network info cache for port 2a71439a-3529-478d-8d96-d237fe2b6ba0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.794 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Start _get_guest_xml network_info=[{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.798 182096 WARNING nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.804 182096 DEBUG nova.virt.libvirt.host [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.804 182096 DEBUG nova.virt.libvirt.host [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.806 182096 DEBUG nova.virt.libvirt.host [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.807 182096 DEBUG nova.virt.libvirt.host [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.807 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.808 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.809 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.809 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.809 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.809 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.809 182096 DEBUG nova.virt.hardware [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.811 182096 DEBUG nova.virt.libvirt.vif [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:31:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1299732424',display_name='tempest-ServerActionsTestOtherB-server-1299732424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1299732424',id=123,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fef146fc9ddc405e809e63a1908b742b',ramdisk_id='',reservation_id='r-aw2o04p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-672595475',owner_user_name='tempest-ServerActionsTestOtherB-672595475-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:31:34Z,user_data=None,user_id='e63c2d78d6f14c07bb779029e80cdefc',uuid=24d1d7fc-0955-4f04-bc30-e81259733fe0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.812 182096 DEBUG nova.network.os_vif_util [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converting VIF {"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.812 182096 DEBUG nova.network.os_vif_util [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.813 182096 DEBUG nova.objects.instance [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lazy-loading 'pci_devices' on Instance uuid 24d1d7fc-0955-4f04-bc30-e81259733fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.823 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <uuid>24d1d7fc-0955-4f04-bc30-e81259733fe0</uuid>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <name>instance-0000007b</name>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerActionsTestOtherB-server-1299732424</nova:name>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:31:37</nova:creationTime>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:user uuid="e63c2d78d6f14c07bb779029e80cdefc">tempest-ServerActionsTestOtherB-672595475-project-member</nova:user>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:project uuid="fef146fc9ddc405e809e63a1908b742b">tempest-ServerActionsTestOtherB-672595475</nova:project>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         <nova:port uuid="2a71439a-3529-478d-8d96-d237fe2b6ba0">
Jan 23 09:31:37 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <system>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="serial">24d1d7fc-0955-4f04-bc30-e81259733fe0</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="uuid">24d1d7fc-0955-4f04-bc30-e81259733fe0</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </system>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <os>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </os>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <features>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </features>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.config"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:b7:12:1d"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <target dev="tap2a71439a-35"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/console.log" append="off"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <video>
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </video>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:31:37 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:31:37 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:31:37 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:31:37 compute-0 nova_compute[182092]: </domain>
Jan 23 09:31:37 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.823 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Preparing to wait for external event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.824 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.824 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.824 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.824 182096 DEBUG nova.virt.libvirt.vif [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:31:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1299732424',display_name='tempest-ServerActionsTestOtherB-server-1299732424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1299732424',id=123,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fef146fc9ddc405e809e63a1908b742b',ramdisk_id='',reservation_id='r-aw2o04p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-672595475',owner_user_name='tempest-ServerActionsTestOtherB-672595475-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:31:34Z,user_data=None,user_id='e63c2d78d6f14c07bb779029e80cdefc',uuid=24d1d7fc-0955-4f04-bc30-e81259733fe0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.824 182096 DEBUG nova.network.os_vif_util [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converting VIF {"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.825 182096 DEBUG nova.network.os_vif_util [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.825 182096 DEBUG os_vif [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.825 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.826 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.826 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.828 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.828 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a71439a-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.828 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a71439a-35, col_values=(('external_ids', {'iface-id': '2a71439a-3529-478d-8d96-d237fe2b6ba0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:12:1d', 'vm-uuid': '24d1d7fc-0955-4f04-bc30-e81259733fe0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:37 compute-0 NetworkManager[54920]: <info>  [1769160697.8299] manager: (tap2a71439a-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.831 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.834 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.834 182096 INFO os_vif [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35')
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.866 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.866 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.867 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] No VIF found with MAC fa:16:3e:b7:12:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:31:37 compute-0 nova_compute[182092]: 2026-01-23 09:31:37.867 182096 INFO nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Using config drive
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.233 182096 INFO nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Creating config drive at /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.config
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.237 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq368gbc0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.356 182096 DEBUG oslo_concurrency.processutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq368gbc0" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:38 compute-0 kernel: tap2a71439a-35: entered promiscuous mode
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.3887] manager: (tap2a71439a-35): new Tun device (/org/freedesktop/NetworkManager/Devices/225)
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.391 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.393 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00434|binding|INFO|Claiming lport 2a71439a-3529-478d-8d96-d237fe2b6ba0 for this chassis.
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00435|binding|INFO|2a71439a-3529-478d-8d96-d237fe2b6ba0: Claiming fa:16:3e:b7:12:1d 10.100.0.7
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.396 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.403 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.4057] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.4062] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.408 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:12:1d 10.100.0.7'], port_security=['fa:16:3e:b7:12:1d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '24d1d7fc-0955-4f04-bc30-e81259733fe0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fef146fc9ddc405e809e63a1908b742b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '408973b4-49fc-4c4a-a8b2-e0ee180a79a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ea1b6e-a817-4282-97f2-ebc8668f9238, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a71439a-3529-478d-8d96-d237fe2b6ba0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.409 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a71439a-3529-478d-8d96-d237fe2b6ba0 in datapath 71b50fec-aa37-4677-8fe9-5af0665ed5a0 bound to our chassis
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.410 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 71b50fec-aa37-4677-8fe9-5af0665ed5a0
Jan 23 09:31:38 compute-0 systemd-udevd[224501]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.419 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc6d9d-d9c3-4893-ac60-70a721bb3c50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.420 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap71b50fec-a1 in ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.422 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap71b50fec-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.422 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b1903d10-1e74-4a94-93e0-ba35616699a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.422 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ee1ee6-234b-4275-890d-0d5fab31cdc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 systemd-machined[153562]: New machine qemu-62-instance-0000007b.
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.4320] device (tap2a71439a-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.4328] device (tap2a71439a-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.433 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[905f8525-ad1b-4243-809b-43dc9d6711e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000007b.
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.458 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4afd237-b0e9-4389-b0bb-1517095b5281]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.514 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[91e251b3-4bdb-4a79-bba2-cd710bf6f255]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.532 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf916bf-0eb4-4c65-9b45-39e7aec4be5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.5333] manager: (tap71b50fec-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Jan 23 09:31:38 compute-0 systemd-udevd[224505]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.556 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[76128efb-99de-458f-b53b-ac324f5f853c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.558 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[85eb4ce7-b87a-4bae-8ad9-1cd21c65d098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.5720] device (tap71b50fec-a0): carrier: link connected
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.574 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf233ba-4f97-4ded-823e-53f4cbacdc3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.587 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.586 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0a456abf-66d3-4714-9c35-1d6056fac6b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71b50fec-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:72:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420405, 'reachable_time': 38760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224526, 'error': None, 'target': 'ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00436|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.600 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8358552-f4b8-42a1-989f-80361bd47006]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:7269'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420405, 'tstamp': 420405}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224527, 'error': None, 'target': 'ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.604 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.612 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[74c8f551-650b-4eec-9e45-310ab9016c30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap71b50fec-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:72:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420405, 'reachable_time': 38760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224528, 'error': None, 'target': 'ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00437|binding|INFO|Setting lport 2a71439a-3529-478d-8d96-d237fe2b6ba0 ovn-installed in OVS
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00438|binding|INFO|Setting lport 2a71439a-3529-478d-8d96-d237fe2b6ba0 up in Southbound
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.631 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d2695247-5552-4ced-ad97-bac5ee74bc6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.669 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b79ba7e4-0e8d-4775-a7d1-510f895d7349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.670 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71b50fec-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.670 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.671 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71b50fec-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:38 compute-0 kernel: tap71b50fec-a0: entered promiscuous mode
Jan 23 09:31:38 compute-0 NetworkManager[54920]: <info>  [1769160698.6734] manager: (tap71b50fec-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.673 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.677 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap71b50fec-a0, col_values=(('external_ids', {'iface-id': 'f485acd3-6e23-47ae-a955-6b3b7f7fb913'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:38 compute-0 ovn_controller[94697]: 2026-01-23T09:31:38Z|00439|binding|INFO|Releasing lport f485acd3-6e23-47ae-a955-6b3b7f7fb913 from this chassis (sb_readonly=0)
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.678 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.679 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/71b50fec-aa37-4677-8fe9-5af0665ed5a0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/71b50fec-aa37-4677-8fe9-5af0665ed5a0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.681 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e65f1f1f-8510-4753-8f77-d30ce374fd38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.681 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-71b50fec-aa37-4677-8fe9-5af0665ed5a0
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/71b50fec-aa37-4677-8fe9-5af0665ed5a0.pid.haproxy
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 71b50fec-aa37-4677-8fe9-5af0665ed5a0
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:31:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:38.683 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'env', 'PROCESS_TAG=haproxy-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/71b50fec-aa37-4677-8fe9-5af0665ed5a0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.691 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.731 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160698.730789, 24d1d7fc-0955-4f04-bc30-e81259733fe0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.731 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] VM Started (Lifecycle Event)
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.744 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.746 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160698.7308867, 24d1d7fc-0955-4f04-bc30-e81259733fe0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.746 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] VM Paused (Lifecycle Event)
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.757 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.759 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.770 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.942 182096 DEBUG nova.network.neutron [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updated VIF entry in instance network info cache for port 2a71439a-3529-478d-8d96-d237fe2b6ba0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.942 182096 DEBUG nova.network.neutron [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updating instance_info_cache with network_info: [{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:38 compute-0 nova_compute[182092]: 2026-01-23 09:31:38.962 182096 DEBUG oslo_concurrency.lockutils [req-1dce525c-a31c-494a-a126-320de7ca89dd req-ecd8432c-a4a6-4a19-a1d8-cab4597b683d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:38 compute-0 podman[224563]: 2026-01-23 09:31:38.975451283 +0000 UTC m=+0.030320896 container create 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:31:38 compute-0 systemd[1]: Started libpod-conmon-4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5.scope.
Jan 23 09:31:39 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:31:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/072b479e071e2253452a949758b6b9ee994ebab32866b6ca9bcafd46b7fa640c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:31:39 compute-0 podman[224563]: 2026-01-23 09:31:39.027832463 +0000 UTC m=+0.082702065 container init 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:31:39 compute-0 podman[224563]: 2026-01-23 09:31:39.031697974 +0000 UTC m=+0.086567577 container start 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:31:39 compute-0 podman[224563]: 2026-01-23 09:31:38.961417099 +0000 UTC m=+0.016286712 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:31:39 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [NOTICE]   (224579) : New worker (224581) forked
Jan 23 09:31:39 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [NOTICE]   (224579) : Loading success.
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.203 182096 DEBUG nova.compute.manager [req-44d0e840-3cbc-482b-9e61-4c09f657f92d req-ca8808db-c392-4ed0-9b59-d897d58831d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.204 182096 DEBUG oslo_concurrency.lockutils [req-44d0e840-3cbc-482b-9e61-4c09f657f92d req-ca8808db-c392-4ed0-9b59-d897d58831d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.204 182096 DEBUG oslo_concurrency.lockutils [req-44d0e840-3cbc-482b-9e61-4c09f657f92d req-ca8808db-c392-4ed0-9b59-d897d58831d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.205 182096 DEBUG oslo_concurrency.lockutils [req-44d0e840-3cbc-482b-9e61-4c09f657f92d req-ca8808db-c392-4ed0-9b59-d897d58831d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.205 182096 DEBUG nova.compute.manager [req-44d0e840-3cbc-482b-9e61-4c09f657f92d req-ca8808db-c392-4ed0-9b59-d897d58831d1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Processing event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.205 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.208 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160699.2083387, 24d1d7fc-0955-4f04-bc30-e81259733fe0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.208 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] VM Resumed (Lifecycle Event)
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.209 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.211 182096 INFO nova.virt.libvirt.driver [-] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance spawned successfully.
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.211 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.226 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.227 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.227 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.227 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.227 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.228 182096 DEBUG nova.virt.libvirt.driver [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.244 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.245 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.264 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.277 182096 INFO nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Took 4.29 seconds to spawn the instance on the hypervisor.
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.277 182096 DEBUG nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.325 182096 INFO nova.compute.manager [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Took 4.70 seconds to build instance.
Jan 23 09:31:39 compute-0 nova_compute[182092]: 2026-01-23 09:31:39.338 182096 DEBUG oslo_concurrency.lockutils [None req-160951ac-51e5-471f-aea5-7063f71c6858 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:39.864 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:39.866 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:40 compute-0 podman[224586]: 2026-01-23 09:31:40.213317285 +0000 UTC m=+0.048710998 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:31:40 compute-0 podman[224587]: 2026-01-23 09:31:40.237164559 +0000 UTC m=+0.070010476 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.583 182096 INFO nova.compute.manager [None req-8ce386d6-7505-47d6-9558-12a4e23dd5d7 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Pausing
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.585 182096 DEBUG nova.objects.instance [None req-8ce386d6-7505-47d6-9558-12a4e23dd5d7 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lazy-loading 'flavor' on Instance uuid 24d1d7fc-0955-4f04-bc30-e81259733fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.618 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160700.6183078, 24d1d7fc-0955-4f04-bc30-e81259733fe0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.618 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] VM Paused (Lifecycle Event)
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.620 182096 DEBUG nova.compute.manager [None req-8ce386d6-7505-47d6-9558-12a4e23dd5d7 e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.645 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.647 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:31:40 compute-0 nova_compute[182092]: 2026-01-23 09:31:40.670 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.274 182096 DEBUG nova.compute.manager [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.275 182096 DEBUG oslo_concurrency.lockutils [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.275 182096 DEBUG oslo_concurrency.lockutils [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.275 182096 DEBUG oslo_concurrency.lockutils [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.275 182096 DEBUG nova.compute.manager [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] No waiting events found dispatching network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:41 compute-0 nova_compute[182092]: 2026-01-23 09:31:41.276 182096 WARNING nova.compute.manager [req-51d6ba2d-55a8-4cb6-95b7-6cbaae475639 req-42de6fab-9218-4541-99fc-50c7db94b975 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received unexpected event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 for instance with vm_state paused and task_state None.
Jan 23 09:31:42 compute-0 ovn_controller[94697]: 2026-01-23T09:31:42Z|00440|binding|INFO|Releasing lport f485acd3-6e23-47ae-a955-6b3b7f7fb913 from this chassis (sb_readonly=0)
Jan 23 09:31:42 compute-0 ovn_controller[94697]: 2026-01-23T09:31:42Z|00441|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.174 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.186 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.829 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.833 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.833 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.834 182096 INFO nova.compute.manager [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Shelving
Jan 23 09:31:42 compute-0 kernel: tap2a71439a-35 (unregistering): left promiscuous mode
Jan 23 09:31:42 compute-0 NetworkManager[54920]: <info>  [1769160702.8881] device (tap2a71439a-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:31:42 compute-0 ovn_controller[94697]: 2026-01-23T09:31:42Z|00442|binding|INFO|Releasing lport 2a71439a-3529-478d-8d96-d237fe2b6ba0 from this chassis (sb_readonly=0)
Jan 23 09:31:42 compute-0 ovn_controller[94697]: 2026-01-23T09:31:42Z|00443|binding|INFO|Setting lport 2a71439a-3529-478d-8d96-d237fe2b6ba0 down in Southbound
Jan 23 09:31:42 compute-0 ovn_controller[94697]: 2026-01-23T09:31:42Z|00444|binding|INFO|Removing iface tap2a71439a-35 ovn-installed in OVS
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.892 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:42.896 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:12:1d 10.100.0.7'], port_security=['fa:16:3e:b7:12:1d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '24d1d7fc-0955-4f04-bc30-e81259733fe0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fef146fc9ddc405e809e63a1908b742b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '408973b4-49fc-4c4a-a8b2-e0ee180a79a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ea1b6e-a817-4282-97f2-ebc8668f9238, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a71439a-3529-478d-8d96-d237fe2b6ba0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:31:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:42.897 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a71439a-3529-478d-8d96-d237fe2b6ba0 in datapath 71b50fec-aa37-4677-8fe9-5af0665ed5a0 unbound from our chassis
Jan 23 09:31:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:42.898 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71b50fec-aa37-4677-8fe9-5af0665ed5a0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:31:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:42.899 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[66383344-1150-46ac-96bc-48ab5c329e32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:42.899 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0 namespace which is not needed anymore
Jan 23 09:31:42 compute-0 nova_compute[182092]: 2026-01-23 09:31:42.916 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:42 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 23 09:31:42 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007b.scope: Consumed 1.672s CPU time.
Jan 23 09:31:42 compute-0 systemd-machined[153562]: Machine qemu-62-instance-0000007b terminated.
Jan 23 09:31:42 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [NOTICE]   (224579) : haproxy version is 2.8.14-c23fe91
Jan 23 09:31:42 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [NOTICE]   (224579) : path to executable is /usr/sbin/haproxy
Jan 23 09:31:42 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [WARNING]  (224579) : Exiting Master process...
Jan 23 09:31:42 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [ALERT]    (224579) : Current worker (224581) exited with code 143 (Terminated)
Jan 23 09:31:42 compute-0 neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0[224575]: [WARNING]  (224579) : All workers exited. Exiting... (0)
Jan 23 09:31:42 compute-0 systemd[1]: libpod-4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5.scope: Deactivated successfully.
Jan 23 09:31:43 compute-0 podman[224646]: 2026-01-23 09:31:43.000449278 +0000 UTC m=+0.037450186 container died 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:31:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5-userdata-shm.mount: Deactivated successfully.
Jan 23 09:31:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-072b479e071e2253452a949758b6b9ee994ebab32866b6ca9bcafd46b7fa640c-merged.mount: Deactivated successfully.
Jan 23 09:31:43 compute-0 podman[224646]: 2026-01-23 09:31:43.019521726 +0000 UTC m=+0.056522643 container cleanup 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:31:43 compute-0 systemd[1]: libpod-conmon-4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5.scope: Deactivated successfully.
Jan 23 09:31:43 compute-0 podman[224670]: 2026-01-23 09:31:43.060175002 +0000 UTC m=+0.024795874 container remove 4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.063 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5cdb9eab-757b-4fa9-aee5-1ee9040e5595]: (4, ('Fri Jan 23 09:31:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0 (4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5)\n4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5\nFri Jan 23 09:31:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0 (4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5)\n4eafed66fb4f06a74794e2f256e4399be877392d3b5f933d070789c68194c7e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.064 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[37ad0130-bb1b-4a9d-bb72-28da5c406ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.065 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71b50fec-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:43 compute-0 kernel: tap71b50fec-a0: left promiscuous mode
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.067 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.083 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.087 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a4017372-bb68-4318-a54b-ac92524ead15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 NetworkManager[54920]: <info>  [1769160703.0878] manager: (tap2a71439a-35): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.098 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f4180d-48ea-470c-a5b1-cd0d05fd8dd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.099 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c70fe044-f99b-475d-b26a-58d1cfb775bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.112 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6add4c10-24ca-4526-b164-e12b395918b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420399, 'reachable_time': 30946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224695, 'error': None, 'target': 'ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.114 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-71b50fec-aa37-4677-8fe9-5af0665ed5a0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:31:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:31:43.114 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa1f5d5-6f37-45ec-84af-cdc5a938f160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:31:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d71b50fec\x2daa37\x2d4677\x2d8fe9\x2d5af0665ed5a0.mount: Deactivated successfully.
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.118 182096 INFO nova.virt.libvirt.driver [-] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance destroyed successfully.
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.118 182096 DEBUG nova.objects.instance [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lazy-loading 'numa_topology' on Instance uuid 24d1d7fc-0955-4f04-bc30-e81259733fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.349 182096 DEBUG nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-vif-unplugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.349 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.349 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.349 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] No waiting events found dispatching network-vif-unplugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 WARNING nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received unexpected event network-vif-unplugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 for instance with vm_state paused and task_state shelving.
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG oslo_concurrency.lockutils [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.350 182096 DEBUG nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] No waiting events found dispatching network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.351 182096 WARNING nova.compute.manager [req-6237d9ec-dc0b-4171-926f-0bf103636ead req-ea89fc18-36df-43f6-914e-b48c7b5a92fb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received unexpected event network-vif-plugged-2a71439a-3529-478d-8d96-d237fe2b6ba0 for instance with vm_state paused and task_state shelving.
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.443 182096 INFO nova.virt.libvirt.driver [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Beginning cold snapshot process
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.596 182096 DEBUG nova.privsep.utils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.597 182096 DEBUG oslo_concurrency.processutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk /var/lib/nova/instances/snapshots/tmpyhv5lkj9/b064a87a9f6c4df1a93983816fef925d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.651 182096 DEBUG oslo_concurrency.processutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0/disk /var/lib/nova/instances/snapshots/tmpyhv5lkj9/b064a87a9f6c4df1a93983816fef925d" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:31:43 compute-0 nova_compute[182092]: 2026-01-23 09:31:43.652 182096 INFO nova.virt.libvirt.driver [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Snapshot extracted, beginning image upload
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.669 182096 INFO nova.virt.libvirt.driver [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Snapshot image upload complete
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.669 182096 DEBUG nova.compute.manager [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.726 182096 INFO nova.compute.manager [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Shelve offloading
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.736 182096 INFO nova.virt.libvirt.driver [-] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance destroyed successfully.
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.736 182096 DEBUG nova.compute.manager [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.738 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.738 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquired lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:45 compute-0 nova_compute[182092]: 2026-01-23 09:31:45.738 182096 DEBUG nova.network.neutron [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:31:47 compute-0 nova_compute[182092]: 2026-01-23 09:31:47.187 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:47 compute-0 nova_compute[182092]: 2026-01-23 09:31:47.748 182096 DEBUG nova.network.neutron [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updating instance_info_cache with network_info: [{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:47 compute-0 nova_compute[182092]: 2026-01-23 09:31:47.767 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Releasing lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:47 compute-0 nova_compute[182092]: 2026-01-23 09:31:47.831 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.593 182096 INFO nova.virt.libvirt.driver [-] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Instance destroyed successfully.
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.594 182096 DEBUG nova.objects.instance [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lazy-loading 'resources' on Instance uuid 24d1d7fc-0955-4f04-bc30-e81259733fe0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.612 182096 DEBUG nova.virt.libvirt.vif [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:31:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1299732424',display_name='tempest-ServerActionsTestOtherB-server-1299732424',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1299732424',id=123,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:31:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='fef146fc9ddc405e809e63a1908b742b',ramdisk_id='',reservation_id='r-aw2o04p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-672595475',owner_user_name='tempest-ServerActionsTestOtherB-672595475-project-member',shelved_at='2026-01-23T09:31:45.669756',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='c3d279dc-8812-48ee-8666-b03843a3473b'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:31:43Z,user_data=None,user_id='e63c2d78d6f14c07bb779029e80cdefc',uuid=24d1d7fc-0955-4f04-bc30-e81259733fe0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.612 182096 DEBUG nova.network.os_vif_util [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converting VIF {"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a71439a-35", "ovs_interfaceid": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.613 182096 DEBUG nova.network.os_vif_util [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.613 182096 DEBUG os_vif [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.614 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.615 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a71439a-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.619 182096 INFO os_vif [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:12:1d,bridge_name='br-int',has_traffic_filtering=True,id=2a71439a-3529-478d-8d96-d237fe2b6ba0,network=Network(71b50fec-aa37-4677-8fe9-5af0665ed5a0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a71439a-35')
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.619 182096 INFO nova.virt.libvirt.driver [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Deleting instance files /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0_del
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.620 182096 INFO nova.virt.libvirt.driver [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Deletion of /var/lib/nova/instances/24d1d7fc-0955-4f04-bc30-e81259733fe0_del complete
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.678 182096 DEBUG nova.compute.manager [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Received event network-changed-2a71439a-3529-478d-8d96-d237fe2b6ba0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.678 182096 DEBUG nova.compute.manager [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Refreshing instance network info cache due to event network-changed-2a71439a-3529-478d-8d96-d237fe2b6ba0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.678 182096 DEBUG oslo_concurrency.lockutils [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.679 182096 DEBUG oslo_concurrency.lockutils [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.679 182096 DEBUG nova.network.neutron [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Refreshing network info cache for port 2a71439a-3529-478d-8d96-d237fe2b6ba0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.701 182096 INFO nova.scheduler.client.report [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Deleted allocations for instance 24d1d7fc-0955-4f04-bc30-e81259733fe0
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.757 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.758 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.795 182096 DEBUG nova.compute.provider_tree [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.807 182096 DEBUG nova.scheduler.client.report [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.823 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:48 compute-0 nova_compute[182092]: 2026-01-23 09:31:48.870 182096 DEBUG oslo_concurrency.lockutils [None req-b0b8ee4b-df4f-4d60-96be-99b5fc6ad2af e63c2d78d6f14c07bb779029e80cdefc fef146fc9ddc405e809e63a1908b742b - - default default] Lock "24d1d7fc-0955-4f04-bc30-e81259733fe0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 6.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:31:49 compute-0 nova_compute[182092]: 2026-01-23 09:31:49.052 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:50 compute-0 nova_compute[182092]: 2026-01-23 09:31:50.318 182096 DEBUG nova.network.neutron [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updated VIF entry in instance network info cache for port 2a71439a-3529-478d-8d96-d237fe2b6ba0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:31:50 compute-0 nova_compute[182092]: 2026-01-23 09:31:50.319 182096 DEBUG nova.network.neutron [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Updating instance_info_cache with network_info: [{"id": "2a71439a-3529-478d-8d96-d237fe2b6ba0", "address": "fa:16:3e:b7:12:1d", "network": {"id": "71b50fec-aa37-4677-8fe9-5af0665ed5a0", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1393258748-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fef146fc9ddc405e809e63a1908b742b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap2a71439a-35", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:31:50 compute-0 nova_compute[182092]: 2026-01-23 09:31:50.346 182096 DEBUG oslo_concurrency.lockutils [req-051cd401-9e40-4d25-a66a-b8b6212c60f7 req-0405f40f-682a-404d-93d3-98ae1bc66d96 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-24d1d7fc-0955-4f04-bc30-e81259733fe0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:31:51 compute-0 podman[224709]: 2026-01-23 09:31:51.205203908 +0000 UTC m=+0.041770716 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:31:51 compute-0 podman[224710]: 2026-01-23 09:31:51.20577257 +0000 UTC m=+0.041729406 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:31:52 compute-0 nova_compute[182092]: 2026-01-23 09:31:52.189 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:53 compute-0 nova_compute[182092]: 2026-01-23 09:31:53.375 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:53 compute-0 nova_compute[182092]: 2026-01-23 09:31:53.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:54 compute-0 podman[224746]: 2026-01-23 09:31:54.200441291 +0000 UTC m=+0.037427151 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 23 09:31:57 compute-0 nova_compute[182092]: 2026-01-23 09:31:57.189 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:31:58 compute-0 nova_compute[182092]: 2026-01-23 09:31:58.114 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160703.113767, 24d1d7fc-0955-4f04-bc30-e81259733fe0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:31:58 compute-0 nova_compute[182092]: 2026-01-23 09:31:58.114 182096 INFO nova.compute.manager [-] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] VM Stopped (Lifecycle Event)
Jan 23 09:31:58 compute-0 nova_compute[182092]: 2026-01-23 09:31:58.151 182096 DEBUG nova.compute.manager [None req-2ee5793d-7b13-4509-82c8-dd7cefb6d3db - - - - - -] [instance: 24d1d7fc-0955-4f04-bc30-e81259733fe0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:31:58 compute-0 nova_compute[182092]: 2026-01-23 09:31:58.618 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:00 compute-0 ovn_controller[94697]: 2026-01-23T09:32:00Z|00445|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:32:00 compute-0 nova_compute[182092]: 2026-01-23 09:32:00.618 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:00 compute-0 ovn_controller[94697]: 2026-01-23T09:32:00Z|00446|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:32:00 compute-0 nova_compute[182092]: 2026-01-23 09:32:00.779 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:01 compute-0 anacron[84233]: Job `cron.daily' started
Jan 23 09:32:01 compute-0 anacron[84233]: Job `cron.daily' terminated
Jan 23 09:32:02 compute-0 nova_compute[182092]: 2026-01-23 09:32:02.190 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:03 compute-0 podman[224767]: 2026-01-23 09:32:03.225265602 +0000 UTC m=+0.063131901 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:32:03 compute-0 nova_compute[182092]: 2026-01-23 09:32:03.619 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.117 182096 INFO nova.compute.manager [None req-74270f11-3cd2-4a53-9d79-7a9c57c4e969 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Pausing
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.118 182096 DEBUG nova.objects.instance [None req-74270f11-3cd2-4a53-9d79-7a9c57c4e969 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'flavor' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.145 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160724.1457126, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.146 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Paused (Lifecycle Event)
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.147 182096 DEBUG nova.compute.manager [None req-74270f11-3cd2-4a53-9d79-7a9c57c4e969 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.166 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.167 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.188 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 23 09:32:04 compute-0 nova_compute[182092]: 2026-01-23 09:32:04.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.663 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.663 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.663 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.663 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.731 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.779 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.780 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:05 compute-0 nova_compute[182092]: 2026-01-23 09:32:05.837 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.061 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.062 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5591MB free_disk=73.23438262939453GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.062 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.062 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.115 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.165 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.181 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.207 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.207 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.249 182096 INFO nova.compute.manager [None req-d589ab20-4753-452c-bf90-36a36a576286 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Unpausing
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.250 182096 DEBUG nova.objects.instance [None req-d589ab20-4753-452c-bf90-36a36a576286 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'flavor' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.278 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160726.2777424, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.278 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Resumed (Lifecycle Event)
Jan 23 09:32:06 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.281 182096 DEBUG nova.virt.libvirt.guest [None req-d589ab20-4753-452c-bf90-36a36a576286 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.281 182096 DEBUG nova.compute.manager [None req-d589ab20-4753-452c-bf90-36a36a576286 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.299 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.301 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:06 compute-0 nova_compute[182092]: 2026-01-23 09:32:06.330 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.192 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.208 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.208 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.907 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.907 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.907 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:32:07 compute-0 nova_compute[182092]: 2026-01-23 09:32:07.907 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:08 compute-0 nova_compute[182092]: 2026-01-23 09:32:08.621 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:09 compute-0 nova_compute[182092]: 2026-01-23 09:32:09.228 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updating instance_info_cache with network_info: [{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:09 compute-0 nova_compute[182092]: 2026-01-23 09:32:09.247 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:09 compute-0 nova_compute[182092]: 2026-01-23 09:32:09.247 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:32:09 compute-0 nova_compute[182092]: 2026-01-23 09:32:09.247 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:10 compute-0 nova_compute[182092]: 2026-01-23 09:32:10.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:10 compute-0 nova_compute[182092]: 2026-01-23 09:32:10.651 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:10 compute-0 nova_compute[182092]: 2026-01-23 09:32:10.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:32:11 compute-0 podman[224798]: 2026-01-23 09:32:11.20788719 +0000 UTC m=+0.039017842 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:32:11 compute-0 podman[224797]: 2026-01-23 09:32:11.212275248 +0000 UTC m=+0.045703152 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 09:32:11 compute-0 nova_compute[182092]: 2026-01-23 09:32:11.968 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:11 compute-0 nova_compute[182092]: 2026-01-23 09:32:11.968 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:11 compute-0 nova_compute[182092]: 2026-01-23 09:32:11.984 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.064 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.064 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.068 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.069 182096 INFO nova.compute.claims [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.185 182096 DEBUG nova.compute.provider_tree [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.193 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.196 182096 DEBUG nova.scheduler.client.report [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.211 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.211 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.253 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.254 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.266 182096 INFO nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.277 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.373 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.373 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.374 182096 INFO nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Creating image(s)
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.374 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.375 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.375 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.387 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.434 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.435 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.436 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.445 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.492 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.493 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.526 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.527 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.528 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.585 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.586 182096 DEBUG nova.virt.disk.api [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Checking if we can resize image /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.587 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.641 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.642 182096 DEBUG nova.virt.disk.api [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Cannot resize image /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.642 182096 DEBUG nova.objects.instance [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'migration_context' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.652 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.652 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Ensure instance console log exists: /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.652 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.653 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.653 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:12 compute-0 nova_compute[182092]: 2026-01-23 09:32:12.833 182096 DEBUG nova.policy [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:32:13 compute-0 nova_compute[182092]: 2026-01-23 09:32:13.301 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:13.300 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:13.303 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:32:13 compute-0 nova_compute[182092]: 2026-01-23 09:32:13.380 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Successfully created port: 82ac778e-8b80-408b-8434-4614306b12a2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:32:13 compute-0 nova_compute[182092]: 2026-01-23 09:32:13.623 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:13 compute-0 nova_compute[182092]: 2026-01-23 09:32:13.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:13 compute-0 nova_compute[182092]: 2026-01-23 09:32:13.667 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.018 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Successfully updated port: 82ac778e-8b80-408b-8434-4614306b12a2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.044 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.045 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquired lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.045 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.270 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.354 182096 DEBUG nova.compute.manager [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-changed-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.354 182096 DEBUG nova.compute.manager [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing instance network info cache due to event network-changed-82ac778e-8b80-408b-8434-4614306b12a2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:32:14 compute-0 nova_compute[182092]: 2026-01-23 09:32:14.354 182096 DEBUG oslo_concurrency.lockutils [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.378 182096 DEBUG nova.network.neutron [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.393 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Releasing lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.394 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance network_info: |[{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.394 182096 DEBUG oslo_concurrency.lockutils [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.394 182096 DEBUG nova.network.neutron [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing network info cache for port 82ac778e-8b80-408b-8434-4614306b12a2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.396 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start _get_guest_xml network_info=[{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.400 182096 WARNING nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.406 182096 DEBUG nova.virt.libvirt.host [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.407 182096 DEBUG nova.virt.libvirt.host [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.410 182096 DEBUG nova.virt.libvirt.host [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.410 182096 DEBUG nova.virt.libvirt.host [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.411 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.411 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.411 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.412 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.412 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.412 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.412 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.412 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.413 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.413 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.413 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.413 182096 DEBUG nova.virt.hardware [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.416 182096 DEBUG nova.virt.libvirt.vif [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:12Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.416 182096 DEBUG nova.network.os_vif_util [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.416 182096 DEBUG nova.network.os_vif_util [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.417 182096 DEBUG nova.objects.instance [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.426 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <uuid>4764c1b5-427e-4933-ae62-df47006aa673</uuid>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <name>instance-0000007d</name>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-580802037</nova:name>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:32:16</nova:creationTime>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:user uuid="2880f53bded147989ea61dc68ec0880e">tempest-TestNetworkAdvancedServerOps-169193993-project-member</nova:user>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:project uuid="5a5525bfc549464cace77d44548fb012">tempest-TestNetworkAdvancedServerOps-169193993</nova:project>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         <nova:port uuid="82ac778e-8b80-408b-8434-4614306b12a2">
Jan 23 09:32:16 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <system>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="serial">4764c1b5-427e-4933-ae62-df47006aa673</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="uuid">4764c1b5-427e-4933-ae62-df47006aa673</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </system>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <os>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </os>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <features>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </features>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:45:d9:b0"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <target dev="tap82ac778e-8b"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/console.log" append="off"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <video>
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </video>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:32:16 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:32:16 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:32:16 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:32:16 compute-0 nova_compute[182092]: </domain>
Jan 23 09:32:16 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.427 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Preparing to wait for external event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.427 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.427 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.427 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.428 182096 DEBUG nova.virt.libvirt.vif [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:12Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.428 182096 DEBUG nova.network.os_vif_util [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.429 182096 DEBUG nova.network.os_vif_util [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.429 182096 DEBUG os_vif [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.429 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.429 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.430 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.440 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82ac778e-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.441 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82ac778e-8b, col_values=(('external_ids', {'iface-id': '82ac778e-8b80-408b-8434-4614306b12a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:d9:b0', 'vm-uuid': '4764c1b5-427e-4933-ae62-df47006aa673'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:16 compute-0 NetworkManager[54920]: <info>  [1769160736.4430] manager: (tap82ac778e-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.444 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.449 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.450 182096 INFO os_vif [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b')
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.479 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.479 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.479 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No VIF found with MAC fa:16:3e:45:d9:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:32:16 compute-0 nova_compute[182092]: 2026-01-23 09:32:16.480 182096 INFO nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Using config drive
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.195 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.306 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.455 182096 INFO nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Creating config drive at /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.462 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpelf5z8zv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.583 182096 DEBUG oslo_concurrency.processutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpelf5z8zv" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:17 compute-0 kernel: tap82ac778e-8b: entered promiscuous mode
Jan 23 09:32:17 compute-0 ovn_controller[94697]: 2026-01-23T09:32:17Z|00447|binding|INFO|Claiming lport 82ac778e-8b80-408b-8434-4614306b12a2 for this chassis.
Jan 23 09:32:17 compute-0 ovn_controller[94697]: 2026-01-23T09:32:17Z|00448|binding|INFO|82ac778e-8b80-408b-8434-4614306b12a2: Claiming fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.6408] manager: (tap82ac778e-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.640 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.647 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.648 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f bound to our chassis
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.649 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.661 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[667fc8db-deb1-4639-a456-42d9f63f5876]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.661 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbaaff0ba-b1 in ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.664 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbaaff0ba-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.664 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b603e6a7-4f71-40ba-98db-e4e14e6e6db1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.665 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6615dd77-ae4a-466b-8c2b-d7f1f3527508]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.673 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfd2750-73c6-462e-a1aa-26a458ae4ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 systemd-udevd[224873]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:32:17 compute-0 systemd-machined[153562]: New machine qemu-63-instance-0000007d.
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.695 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc56782-61f2-4ed3-a8e7-63425e8aae3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.6995] device (tap82ac778e-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.7003] device (tap82ac778e-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:32:17 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000007d.
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.696 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.701 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 ovn_controller[94697]: 2026-01-23T09:32:17Z|00449|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 ovn-installed in OVS
Jan 23 09:32:17 compute-0 ovn_controller[94697]: 2026-01-23T09:32:17Z|00450|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 up in Southbound
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.727 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0627aaf4-a598-4795-aa1b-d7df922b26ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.7331] manager: (tapbaaff0ba-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.732 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fc054e44-51af-4250-87f6-d8818d6a592b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.759 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[15908d0f-fa31-404f-945c-ed19516ea5be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.765 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9d781c-bc60-4c43-8533-6d0057c865eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.7891] device (tapbaaff0ba-b0): carrier: link connected
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.794 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb93251-99bb-4cb1-b498-f11f5923d551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.807 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aa26988d-7608-479c-9ff8-a59ec175ab3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaaff0ba-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:07:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424327, 'reachable_time': 42193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224895, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.823 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d8760895-8957-469c-bbb9-2846f170b88b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:7af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424327, 'tstamp': 424327}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224896, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.841 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9eadc43c-2928-4bba-999f-0c4b9a1239ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaaff0ba-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:07:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424327, 'reachable_time': 42193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224897, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.878 182096 DEBUG nova.compute.manager [req-5d4c55d7-1036-4b4a-bd19-1a2dbd5eb724 req-bd4e1658-7a21-4b3b-bb44-4bda018340df 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.878 182096 DEBUG oslo_concurrency.lockutils [req-5d4c55d7-1036-4b4a-bd19-1a2dbd5eb724 req-bd4e1658-7a21-4b3b-bb44-4bda018340df 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.878 182096 DEBUG oslo_concurrency.lockutils [req-5d4c55d7-1036-4b4a-bd19-1a2dbd5eb724 req-bd4e1658-7a21-4b3b-bb44-4bda018340df 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.878 182096 DEBUG oslo_concurrency.lockutils [req-5d4c55d7-1036-4b4a-bd19-1a2dbd5eb724 req-bd4e1658-7a21-4b3b-bb44-4bda018340df 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.879 182096 DEBUG nova.compute.manager [req-5d4c55d7-1036-4b4a-bd19-1a2dbd5eb724 req-bd4e1658-7a21-4b3b-bb44-4bda018340df 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Processing event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.884 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f508c1f-1705-4e41-8373-c3609bb8465c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.949 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b73e12c-8e66-4786-b6fe-485ebc9552f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.950 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaaff0ba-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.950 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.951 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaaff0ba-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:17 compute-0 NetworkManager[54920]: <info>  [1769160737.9531] manager: (tapbaaff0ba-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 23 09:32:17 compute-0 kernel: tapbaaff0ba-b0: entered promiscuous mode
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.954 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaaff0ba-b0, col_values=(('external_ids', {'iface-id': '715202ac-6ce8-4fb8-a4a3-465e6f0c31a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:17 compute-0 ovn_controller[94697]: 2026-01-23T09:32:17Z|00451|binding|INFO|Releasing lport 715202ac-6ce8-4fb8-a4a3-465e6f0c31a7 from this chassis (sb_readonly=0)
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.967 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.969 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.970 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[512590dd-463d-48d4-a1bf-87a6a751d827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:17 compute-0 nova_compute[182092]: 2026-01-23 09:32:17.971 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.972 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:32:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:17.974 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'env', 'PROCESS_TAG=haproxy-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:32:18 compute-0 podman[224925]: 2026-01-23 09:32:18.289936373 +0000 UTC m=+0.034949969 container create 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:32:18 compute-0 systemd[1]: Started libpod-conmon-28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506.scope.
Jan 23 09:32:18 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43809e5ad80077a2406d21182bfa1d1d0798285c9c47851d61f50cbeddb240df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:32:18 compute-0 podman[224925]: 2026-01-23 09:32:18.343852036 +0000 UTC m=+0.088865632 container init 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:32:18 compute-0 podman[224925]: 2026-01-23 09:32:18.348790752 +0000 UTC m=+0.093804349 container start 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:32:18 compute-0 podman[224925]: 2026-01-23 09:32:18.273932734 +0000 UTC m=+0.018946350 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:32:18 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [NOTICE]   (224941) : New worker (224943) forked
Jan 23 09:32:18 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [NOTICE]   (224941) : Loading success.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.383 182096 DEBUG nova.network.neutron [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updated VIF entry in instance network info cache for port 82ac778e-8b80-408b-8434-4614306b12a2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.384 182096 DEBUG nova.network.neutron [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.395 182096 DEBUG oslo_concurrency.lockutils [req-ee4bc6a7-117e-4bfa-aae4-94336a152438 req-4842f0be-9596-4556-94a8-e2e9ede414e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.482 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160738.4818041, 4764c1b5-427e-4933-ae62-df47006aa673 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.482 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Started (Lifecycle Event)
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.484 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.490 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.493 182096 INFO nova.virt.libvirt.driver [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance spawned successfully.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.494 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.495 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.498 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.509 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.510 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.510 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.510 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.511 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.511 182096 DEBUG nova.virt.libvirt.driver [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.513 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.514 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160738.4818997, 4764c1b5-427e-4933-ae62-df47006aa673 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.514 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Paused (Lifecycle Event)
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.534 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.536 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160738.4859023, 4764c1b5-427e-4933-ae62-df47006aa673 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.536 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Resumed (Lifecycle Event)
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.549 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.554 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.564 182096 INFO nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Took 6.19 seconds to spawn the instance on the hypervisor.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.565 182096 DEBUG nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.570 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.615 182096 INFO nova.compute.manager [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Took 6.58 seconds to build instance.
Jan 23 09:32:18 compute-0 nova_compute[182092]: 2026-01-23 09:32:18.625 182096 DEBUG oslo_concurrency.lockutils [None req-4be18187-dc81-46b4-a1a1-1607c7acc730 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.946 182096 DEBUG nova.compute.manager [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.947 182096 DEBUG oslo_concurrency.lockutils [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.947 182096 DEBUG oslo_concurrency.lockutils [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.947 182096 DEBUG oslo_concurrency.lockutils [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.947 182096 DEBUG nova.compute.manager [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:19 compute-0 nova_compute[182092]: 2026-01-23 09:32:19.947 182096 WARNING nova.compute.manager [req-a69b02b2-18ed-41d8-a659-8e4f4b5c55c4 req-adfde46a-c0ca-4471-8b67-8c6099c0e6ab 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state active and task_state None.
Jan 23 09:32:21 compute-0 nova_compute[182092]: 2026-01-23 09:32:21.444 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:22 compute-0 nova_compute[182092]: 2026-01-23 09:32:22.196 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:22 compute-0 podman[224964]: 2026-01-23 09:32:22.220933679 +0000 UTC m=+0.054341799 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:32:22 compute-0 podman[224963]: 2026-01-23 09:32:22.24522415 +0000 UTC m=+0.078781762 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 09:32:23 compute-0 NetworkManager[54920]: <info>  [1769160743.2747] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 23 09:32:23 compute-0 NetworkManager[54920]: <info>  [1769160743.2754] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.280 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.379 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:23 compute-0 ovn_controller[94697]: 2026-01-23T09:32:23Z|00452|binding|INFO|Releasing lport 715202ac-6ce8-4fb8-a4a3-465e6f0c31a7 from this chassis (sb_readonly=0)
Jan 23 09:32:23 compute-0 ovn_controller[94697]: 2026-01-23T09:32:23Z|00453|binding|INFO|Releasing lport 2a93a9c2-b647-41f7-a2b2-cd58e456413a from this chassis (sb_readonly=0)
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.396 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.545 182096 DEBUG nova.compute.manager [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-changed-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.546 182096 DEBUG nova.compute.manager [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing instance network info cache due to event network-changed-82ac778e-8b80-408b-8434-4614306b12a2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.547 182096 DEBUG oslo_concurrency.lockutils [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.547 182096 DEBUG oslo_concurrency.lockutils [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:23 compute-0 nova_compute[182092]: 2026-01-23 09:32:23.547 182096 DEBUG nova.network.neutron [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing network info cache for port 82ac778e-8b80-408b-8434-4614306b12a2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:32:24 compute-0 nova_compute[182092]: 2026-01-23 09:32:24.940 182096 DEBUG nova.network.neutron [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updated VIF entry in instance network info cache for port 82ac778e-8b80-408b-8434-4614306b12a2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:32:24 compute-0 nova_compute[182092]: 2026-01-23 09:32:24.941 182096 DEBUG nova.network.neutron [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:24 compute-0 nova_compute[182092]: 2026-01-23 09:32:24.966 182096 DEBUG oslo_concurrency.lockutils [req-ad089c5e-39f0-43ef-b5fc-dcefa7099975 req-19910e71-9214-4fb5-9fdc-996dc36df005 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:25 compute-0 podman[225000]: 2026-01-23 09:32:25.216148581 +0000 UTC m=+0.043707078 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Jan 23 09:32:25 compute-0 nova_compute[182092]: 2026-01-23 09:32:25.916 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:25 compute-0 nova_compute[182092]: 2026-01-23 09:32:25.917 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:25 compute-0 nova_compute[182092]: 2026-01-23 09:32:25.917 182096 INFO nova.compute.manager [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Shelving
Jan 23 09:32:25 compute-0 nova_compute[182092]: 2026-01-23 09:32:25.954 182096 DEBUG nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:32:26 compute-0 nova_compute[182092]: 2026-01-23 09:32:26.446 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:27 compute-0 nova_compute[182092]: 2026-01-23 09:32:27.199 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 kernel: tapc4101a62-8d (unregistering): left promiscuous mode
Jan 23 09:32:28 compute-0 NetworkManager[54920]: <info>  [1769160748.0720] device (tapc4101a62-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.077 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 ovn_controller[94697]: 2026-01-23T09:32:28Z|00454|binding|INFO|Releasing lport c4101a62-8d2e-4ff2-b17d-95110c75a4bd from this chassis (sb_readonly=0)
Jan 23 09:32:28 compute-0 ovn_controller[94697]: 2026-01-23T09:32:28Z|00455|binding|INFO|Setting lport c4101a62-8d2e-4ff2-b17d-95110c75a4bd down in Southbound
Jan 23 09:32:28 compute-0 ovn_controller[94697]: 2026-01-23T09:32:28Z|00456|binding|INFO|Removing iface tapc4101a62-8d ovn-installed in OVS
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.079 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.087 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:77:97 10.100.0.5'], port_security=['fa:16:3e:4f:77:97 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6eb3a9e9-805d-4468-bfd1-03aa390682f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a9e1d00-8438-4823-956e-6cae137c7678', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1b83842511b438480c657f4b89702d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '894695d7-c37a-4406-98cf-62b3145309c4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=854f12a8-7cf0-4ea1-9bb7-22830c9849dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=c4101a62-8d2e-4ff2-b17d-95110c75a4bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.090 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.091 103978 INFO neutron.agent.ovn.metadata.agent [-] Port c4101a62-8d2e-4ff2-b17d-95110c75a4bd in datapath 3a9e1d00-8438-4823-956e-6cae137c7678 unbound from our chassis
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.093 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a9e1d00-8438-4823-956e-6cae137c7678, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.096 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[55266e0e-9d01-418a-9bb0-47bd88277561]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.097 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678 namespace which is not needed anymore
Jan 23 09:32:28 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 23 09:32:28 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000078.scope: Consumed 11.588s CPU time.
Jan 23 09:32:28 compute-0 systemd-machined[153562]: Machine qemu-61-instance-00000078 terminated.
Jan 23 09:32:28 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [NOTICE]   (224407) : haproxy version is 2.8.14-c23fe91
Jan 23 09:32:28 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [NOTICE]   (224407) : path to executable is /usr/sbin/haproxy
Jan 23 09:32:28 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [ALERT]    (224407) : Current worker (224409) exited with code 143 (Terminated)
Jan 23 09:32:28 compute-0 neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678[224403]: [WARNING]  (224407) : All workers exited. Exiting... (0)
Jan 23 09:32:28 compute-0 systemd[1]: libpod-0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c.scope: Deactivated successfully.
Jan 23 09:32:28 compute-0 podman[225037]: 2026-01-23 09:32:28.215563465 +0000 UTC m=+0.035932974 container died 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:32:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c-userdata-shm.mount: Deactivated successfully.
Jan 23 09:32:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-0157c71dae9db8b4776ec3374422eadf2a7870f710cdb0c2a51fc5dbe2826e2a-merged.mount: Deactivated successfully.
Jan 23 09:32:28 compute-0 podman[225037]: 2026-01-23 09:32:28.24303772 +0000 UTC m=+0.063407219 container cleanup 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:32:28 compute-0 systemd[1]: libpod-conmon-0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c.scope: Deactivated successfully.
Jan 23 09:32:28 compute-0 podman[225074]: 2026-01-23 09:32:28.287750344 +0000 UTC m=+0.027427710 container remove 0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.291 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ad187be7-48d3-47ca-8694-11552b794a39]: (4, ('Fri Jan 23 09:32:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678 (0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c)\n0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c\nFri Jan 23 09:32:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678 (0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c)\n0f0a61b2a71d7b29ee3c9d5c471f766ecabd3f52ccbb261f4503c34315664a8c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.296 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[28d4a218-225c-4977-9425-6a14bf23d694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.297 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9e1d00-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:28 compute-0 kernel: tap3a9e1d00-80: left promiscuous mode
Jan 23 09:32:28 compute-0 NetworkManager[54920]: <info>  [1769160748.3092] manager: (tapc4101a62-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.321 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.322 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0d34ffe1-d4f4-4311-ab23-ff478ada1675]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.333 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[05197a8d-474e-42e4-b64b-8fb6058035c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.333 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[98ce034f-8802-4792-bc61-e8ba7058c4cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.352 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0a5245-7c33-4803-8bfe-e2c1307113c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418611, 'reachable_time': 33102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225097, 'error': None, 'target': 'ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d3a9e1d00\x2d8438\x2d4823\x2d956e\x2d6cae137c7678.mount: Deactivated successfully.
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.358 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a9e1d00-8438-4823-956e-6cae137c7678 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:32:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:28.358 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[cd50b1c7-14f6-4e76-b30f-3d0585803466]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.972 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance shutdown successfully after 3 seconds.
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.976 182096 INFO nova.virt.libvirt.driver [-] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance destroyed successfully.
Jan 23 09:32:28 compute-0 nova_compute[182092]: 2026-01-23 09:32:28.977 182096 DEBUG nova.objects.instance [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:29 compute-0 nova_compute[182092]: 2026-01-23 09:32:29.249 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Beginning cold snapshot process
Jan 23 09:32:29 compute-0 ovn_controller[94697]: 2026-01-23T09:32:29Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:32:29 compute-0 ovn_controller[94697]: 2026-01-23T09:32:29Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:32:29 compute-0 nova_compute[182092]: 2026-01-23 09:32:29.383 182096 DEBUG nova.privsep.utils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:32:29 compute-0 nova_compute[182092]: 2026-01-23 09:32:29.383 182096 DEBUG oslo_concurrency.processutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk /var/lib/nova/instances/snapshots/tmptmptz6_k/5d57250525c6481897a289d95fb2b746 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:29 compute-0 nova_compute[182092]: 2026-01-23 09:32:29.529 182096 DEBUG oslo_concurrency.processutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8/disk /var/lib/nova/instances/snapshots/tmptmptz6_k/5d57250525c6481897a289d95fb2b746" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:29 compute-0 nova_compute[182092]: 2026-01-23 09:32:29.530 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Snapshot extracted, beginning image upload
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.493 182096 DEBUG nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-vif-unplugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.494 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.494 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 DEBUG nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] No waiting events found dispatching network-vif-unplugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 WARNING nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received unexpected event network-vif-unplugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd for instance with vm_state active and task_state shelving_image_uploading.
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 DEBUG nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.495 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.496 182096 DEBUG oslo_concurrency.lockutils [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.496 182096 DEBUG nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] No waiting events found dispatching network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:30 compute-0 nova_compute[182092]: 2026-01-23 09:32:30.496 182096 WARNING nova.compute.manager [req-1fdb300d-c7da-4dc0-a60c-a82aa7471563 req-3425a194-7a3a-44b0-af80-54eb85c2d3f0 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received unexpected event network-vif-plugged-c4101a62-8d2e-4ff2-b17d-95110c75a4bd for instance with vm_state active and task_state shelving_image_uploading.
Jan 23 09:32:31 compute-0 ovn_controller[94697]: 2026-01-23T09:32:31Z|00457|binding|INFO|Releasing lport 715202ac-6ce8-4fb8-a4a3-465e6f0c31a7 from this chassis (sb_readonly=0)
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.061 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.448 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.622 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Snapshot image upload complete
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.623 182096 DEBUG nova.compute.manager [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.686 182096 INFO nova.compute.manager [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Shelve offloading
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.697 182096 INFO nova.virt.libvirt.driver [-] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance destroyed successfully.
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.697 182096 DEBUG nova.compute.manager [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.699 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.699 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquired lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:31 compute-0 nova_compute[182092]: 2026-01-23 09:32:31.699 182096 DEBUG nova.network.neutron [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:32:32 compute-0 nova_compute[182092]: 2026-01-23 09:32:32.201 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.002 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6eb3a9e9-805d-4468-bfd1-03aa390682f8', 'name': 'tempest-ServersNegativeTestJSON-server-1468833441', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000078', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'b1b83842511b438480c657f4b89702d0', 'user_id': 'c664df671f084e15bedf8f948ca3d38c', 'hostId': '9b5456620e79645bd16e05a9af7d0080c13ed8ae1837bfa74ec7f3df', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.003 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4764c1b5-427e-4933-ae62-df47006aa673', 'name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5a5525bfc549464cace77d44548fb012', 'user_id': '2880f53bded147989ea61dc68ec0880e', 'hostId': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.004 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.004 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>]
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.005 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.013 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.014 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c06c4a67-1d81-4107-941d-7146b4eef4f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.004816', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71d83d68-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': '97fdc52e64a994c1c6d1f45400ae9998669ebb63f8a873b6b54524b56d16e475'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.004816', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71d84826-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': '5309942a3fdced30dfc24616c6d0072f4fdd4821d37b4ce5f19c1e48cde4e64f'}]}, 'timestamp': '2026-01-23 09:32:33.014703', '_unique_id': '5ba8e1a66f8e482cb0541412e419185b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.016 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cbe8b77-bbdd-47e1-a863-27f767a33cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.016525', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71d8ad0c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': 'a8fd6114eee5686b75804c7c21132bc5e86f58eea691cbc0d0829ca27779d01a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.016525', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71d8b6b2-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': '5430b65d9ac13077d6b3135bf35fa25c31c08bd188579acefa6cb786211eaa44'}]}, 'timestamp': '2026-01-23 09:32:33.017506', '_unique_id': '7010829093ce49948e16ad3673a06ef4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.018 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.018 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>]
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.019 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.020 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4764c1b5-427e-4933-ae62-df47006aa673 / tap82ac778e-8b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73a67e52-c5a9-4376-8ec7-a9eaad40cc3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.019194', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71d94b2c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': 'b9d305a922b2e1d8b3b69243ccda51a7ed115b8b22cd93939f12fb05a8d8d4a8'}]}, 'timestamp': '2026-01-23 09:32:33.021333', '_unique_id': '41db520f15a24ddbb7e2a2a2c80b41c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.021 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.incoming.bytes volume: 1842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '026d10b4-6589-437d-b954-57d80faa8001', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1842, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.022768', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71d9a022-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': 'b92cde8161528ae646c706de986417b2a53c6459536396d577be6a932465bd64'}]}, 'timestamp': '2026-01-23 09:32:33.023495', '_unique_id': 'dd16e8f9cef248c1aaf400354c822843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.023 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.025 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.026 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.027 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b720665-4556-4722-9224-bc6bdf2b3612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.024903', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71da2c5e-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': '754a598f0197d3bc42b16cd72eada556dac5b24dadf72744fce4b4cb269d86ee'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.024903', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71da3cbc-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.534749916, 'message_signature': '7ef4262a8370852d85c6c250511218bd7690b8d5060318a4339ed9bb904a43c2'}]}, 'timestamp': '2026-01-23 09:32:33.027532', '_unique_id': '10416acce2e646c4a4292f376d0e448d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.030 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.031 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.042 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/cpu volume: 9800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d99c726-b68d-4e2f-b0be-006162c4c9ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9800000000, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'timestamp': '2026-01-23T09:32:33.030528', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '71dc9b56-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.572066778, 'message_signature': 'b215f7ebf3a835603404dfab4dfe2a15eb39abe0d8a2796231b7e7c6d6a19f9f'}]}, 'timestamp': '2026-01-23 09:32:33.043034', '_unique_id': '3c1ca95f766a4287a85d5bf8b3f713fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.043 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.045 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.045 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76196acd-f902-4ee6-8dca-ff14ac979da5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.044729', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71dcfe20-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '00b9d67bec9303aa541923402d4b190eaa6df1137470254214e6a745cac2620c'}]}, 'timestamp': '2026-01-23 09:32:33.045562', '_unique_id': 'c9a48442601a4507a87a38ebc0543288'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.047 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.047 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85701a41-0de9-43ad-afc0-557d044fdda1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.047098', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71dd5b18-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '21b6ba34401200ccda1380d9fdf892385bf99f76965b4bfbbc719d05384f8009'}]}, 'timestamp': '2026-01-23 09:32:33.047941', '_unique_id': '0835ed455f6e4793a6a4ad86928626d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.048 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.049 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.049 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>]
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.049 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.050 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.050 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/memory.usage volume: 40.4296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff211ff6-4a93-4bb7-9175-446a98b6fabe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4296875, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'timestamp': '2026-01-23T09:32:33.049915', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '71ddc6b6-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.572066778, 'message_signature': '1749b4764c49309d8e676fec626609afdf817fdfedbcd08578b5c56a3623ba11'}]}, 'timestamp': '2026-01-23 09:32:33.050715', '_unique_id': 'd90b10a1b148432095bfdf170e7a2627'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.052 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.073 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.bytes volume: 72777728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.073 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd9beccb-8eca-4366-a0d7-bad3e403710b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72777728, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.052201', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e139a4-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': 'a64e091dd824739729ded86632bb6989e46848301bba5a1221860870e8361678'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.052201', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e14250-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '212e52d7a0cabd71ae3e4b366def3de71303325a7a4761ad85125296a28fae3d'}]}, 'timestamp': '2026-01-23 09:32:33.073508', '_unique_id': 'c93ad59b2b1b4cd5a92a35a73c4e14e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2f44cc1-e7d4-42cd-b0b8-e061715c769d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.074634', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e18ef4-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '61f334d0387f167d7268b45f610d3daea97220ed1eeab69e4e799c34a9a53b1a'}]}, 'timestamp': '2026-01-23 09:32:33.075485', '_unique_id': '37f04ed984b64a098218fc2ae0af58a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.077 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.077 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.077 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3811af9-87a0-4b71-86a1-4e03cef3d3cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.076558', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e1e778-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '3d5673c3229e1be0617e0b6b8c002716a58e8d5c96261acb1b7bd1f09f2cb021'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.076558', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e1f1fa-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': 'e3a76ae8d0687886c663c83b3bb0940e421be77d03ce169d070d66e75c0b04ba'}]}, 'timestamp': '2026-01-23 09:32:33.078006', '_unique_id': 'a9c04eaa7cf547cda55580cc875f659f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.079 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57182f64-32de-4af3-a324-bf94835967b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.079563', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e24cea-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '816a60081d25235f419e1ee222e10787f56026606a607a522383c54fa5dac4f8'}]}, 'timestamp': '2026-01-23 09:32:33.080360', '_unique_id': 'c665ebfbc7544a0b81dfa937222475c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.082 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.082 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82320741-019b-4b43-af39-4c97bef35490', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.081874', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e2a712-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '3c04ec9170627e1a568042a98f87370e993cabf0072e5674cb09816be08fe315'}]}, 'timestamp': '2026-01-23 09:32:33.082678', '_unique_id': '7cab62a1c07d4a1e959d7df0595a6b59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.084 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.084 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e0bca10-b2b1-4eeb-85f6-4b52afe9c93d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.084176', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e3019e-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '034ca20ce0a23f41cff86782d5182dc60ce124a0219dd44453e9eccea22d9507'}]}, 'timestamp': '2026-01-23 09:32:33.084974', '_unique_id': 'b1cdb6da56cf492db00f1f1f300d0ba3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.085 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.086 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.086 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c25ce95-9182-416b-a08a-f4e3e08ed78d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.086472', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e35a18-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': 'f7cdc74ca79ec739454eea2f39faea45ee3e233abd58d745c0339bafa7d3a226'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.086472', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e363fa-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': 'c03f66f52c0ce8f140aa092bc12b80bf3121685a7b2d365a70279d5cebbff890'}]}, 'timestamp': '2026-01-23 09:32:33.087477', '_unique_id': '2ff3f6169eef49cfa9c0831e54c78a9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.087 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.089 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.089 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.latency volume: 212487023 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.read.latency volume: 60522520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83ff82e8-cc9c-4766-99a6-4ae8d83c19db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 212487023, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.089193', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e3c714-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '6cad21a088dd246a32320339d1e7d9c03846d0e80d8662ea983dcf439c5be31e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60522520, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.089193', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e3d060-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '42419184926d01e0bcca3013760de3897a081e2847e85831a2fb6c841da34878'}]}, 'timestamp': '2026-01-23 09:32:33.090268', '_unique_id': '34a939ef004d4c9aa69a615d72329662'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.092 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.092 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cdc0826-170a-469a-b207-437aa31f6256', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.091791', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e42aa6-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': 'f34505fe6725bf405bb3431103a0f7b49ec91670d51169228a092a4a9a17a1fe'}]}, 'timestamp': '2026-01-23 09:32:33.092577', '_unique_id': 'f8a0e269ce434d40b9b1ce86ad4a539f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.095 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.095 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.095 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e160750-30dd-4feb-92db-b9388d00ba37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.094075', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e4acec-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '647cf8e8c5b9778dd760beb5641be22123089173438e1d963fa140df970b97a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.094075', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e4b642-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '6046a5966e16047b7e3e45b75194f5f530ce4040f8a9bc2fd15f1e9231d42f0a'}]}, 'timestamp': '2026-01-23 09:32:33.096139', '_unique_id': 'bb804946454643b59163328eb57843bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.096 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.098 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6002e068-02da-46e3-a6b0-183905764017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': 'instance-0000007d-4764c1b5-427e-4933-ae62-df47006aa673-tap82ac778e-8b', 'timestamp': '2026-01-23T09:32:33.097636', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'tap82ac778e-8b', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:d9:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82ac778e-8b'}, 'message_id': '71e52fbe-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.549113391, 'message_signature': '8702df9ff509e68d30f9c14ebca9a4c61501acf6d376e12e3c5dadb3d8f7cbd0'}]}, 'timestamp': '2026-01-23 09:32:33.099270', '_unique_id': '78076fe664634c98986ace4f97b809a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.099 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.101 12 DEBUG ceilometer.compute.pollsters [-] Instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000078, id=6eb3a9e9-805d-4468-bfd1-03aa390682f8>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.102 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.latency volume: 620554196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.102 12 DEBUG ceilometer.compute.pollsters [-] 4764c1b5-427e-4933-ae62-df47006aa673/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56d35e85-4b27-4edc-a709-6accaa2717ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 620554196, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-vda', 'timestamp': '2026-01-23T09:32:33.100905', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71e5a750-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': '6776938f615b0733f8e2ed95f9eaa1d4a8d7e38514667a2f92f975e24f372974'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_name': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_name': None, 'resource_id': '4764c1b5-427e-4933-ae62-df47006aa673-sda', 'timestamp': '2026-01-23T09:32:33.100905', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-580802037', 'name': 'instance-0000007d', 'instance_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'instance_type': 'm1.nano', 'host': '77762c52c7acb5c6bd3c5af0ac485514ccc60c6532b9b89b4ee35dc4', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71e5b11e-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4258.582109101, 'message_signature': 'dd0ccb9d4e7f6119db74fd37344cf008e535461364650b522e9bf384f58c77a6'}]}, 'timestamp': '2026-01-23 09:32:33.102561', '_unique_id': 'f6d2beb7c6f14780b2caa609e30e29b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.104 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:32:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:32:33.104 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1468833441>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-580802037>]
Jan 23 09:32:33 compute-0 nova_compute[182092]: 2026-01-23 09:32:33.509 182096 DEBUG nova.network.neutron [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updating instance_info_cache with network_info: [{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:33 compute-0 nova_compute[182092]: 2026-01-23 09:32:33.526 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Releasing lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:34 compute-0 podman[225115]: 2026-01-23 09:32:34.23256947 +0000 UTC m=+0.064982581 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.559 182096 INFO nova.virt.libvirt.driver [-] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Instance destroyed successfully.
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.560 182096 DEBUG nova.objects.instance [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lazy-loading 'resources' on Instance uuid 6eb3a9e9-805d-4468-bfd1-03aa390682f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.572 182096 DEBUG nova.virt.libvirt.vif [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:31:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1468833441',display_name='tempest-ServersNegativeTestJSON-server-1468833441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1468833441',id=120,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:31:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b1b83842511b438480c657f4b89702d0',ramdisk_id='',reservation_id='r-6eeyckqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-977727523',owner_user_name='tempest-ServersNegativeTestJSON-977727523-project-member',shelved_at='2026-01-23T09:32:31.623633',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='2ceba02a-285c-4918-a5bc-dd5d3f254255'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:29Z,user_data=None,user_id='c664df671f084e15bedf8f948ca3d38c',uuid=6eb3a9e9-805d-4468-bfd1-03aa390682f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.572 182096 DEBUG nova.network.os_vif_util [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converting VIF {"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4101a62-8d", "ovs_interfaceid": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.573 182096 DEBUG nova.network.os_vif_util [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.574 182096 DEBUG os_vif [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.575 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.576 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4101a62-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.580 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.585 182096 INFO os_vif [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:77:97,bridge_name='br-int',has_traffic_filtering=True,id=c4101a62-8d2e-4ff2-b17d-95110c75a4bd,network=Network(3a9e1d00-8438-4823-956e-6cae137c7678),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4101a62-8d')
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.585 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Deleting instance files /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8_del
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.590 182096 INFO nova.virt.libvirt.driver [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Deletion of /var/lib/nova/instances/6eb3a9e9-805d-4468-bfd1-03aa390682f8_del complete
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.659 182096 DEBUG nova.compute.manager [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Received event network-changed-c4101a62-8d2e-4ff2-b17d-95110c75a4bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.660 182096 DEBUG nova.compute.manager [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Refreshing instance network info cache due to event network-changed-c4101a62-8d2e-4ff2-b17d-95110c75a4bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.660 182096 DEBUG oslo_concurrency.lockutils [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.660 182096 DEBUG oslo_concurrency.lockutils [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.660 182096 DEBUG nova.network.neutron [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Refreshing network info cache for port c4101a62-8d2e-4ff2-b17d-95110c75a4bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.671 182096 INFO nova.scheduler.client.report [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Deleted allocations for instance 6eb3a9e9-805d-4468-bfd1-03aa390682f8
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.720 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.720 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.774 182096 DEBUG nova.compute.provider_tree [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.784 182096 DEBUG nova.scheduler.client.report [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.806 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.881 182096 INFO nova.compute.manager [None req-d68a696f-0ccf-4b97-a81b-b6cec596fa8b 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Get console output
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.881 182096 DEBUG oslo_concurrency.lockutils [None req-bf652bcf-3c63-43ac-acc2-078300d68f01 c664df671f084e15bedf8f948ca3d38c b1b83842511b438480c657f4b89702d0 - - default default] Lock "6eb3a9e9-805d-4468-bfd1-03aa390682f8" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:34 compute-0 nova_compute[182092]: 2026-01-23 09:32:34.886 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.108 182096 DEBUG oslo_concurrency.lockutils [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.108 182096 DEBUG oslo_concurrency.lockutils [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.108 182096 DEBUG nova.compute.manager [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.110 182096 DEBUG nova.compute.manager [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.111 182096 DEBUG nova.objects.instance [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'flavor' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.131 182096 DEBUG nova.objects.instance [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'info_cache' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.157 182096 DEBUG nova.virt.libvirt.driver [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.572 182096 DEBUG nova.network.neutron [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updated VIF entry in instance network info cache for port c4101a62-8d2e-4ff2-b17d-95110c75a4bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.573 182096 DEBUG nova.network.neutron [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Updating instance_info_cache with network_info: [{"id": "c4101a62-8d2e-4ff2-b17d-95110c75a4bd", "address": "fa:16:3e:4f:77:97", "network": {"id": "3a9e1d00-8438-4823-956e-6cae137c7678", "bridge": null, "label": "tempest-ServersNegativeTestJSON-1011329146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b83842511b438480c657f4b89702d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapc4101a62-8d", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:35 compute-0 nova_compute[182092]: 2026-01-23 09:32:35.591 182096 DEBUG oslo_concurrency.lockutils [req-438f3eb4-3c49-46e1-8ba1-e3a85bf846d1 req-051293d0-e738-4f4d-b8c2-beb6f4f8fa6e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-6eb3a9e9-805d-4468-bfd1-03aa390682f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.776 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.776 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.792 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.907 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.907 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.913 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:32:36 compute-0 nova_compute[182092]: 2026-01-23 09:32:36.913 182096 INFO nova.compute.claims [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.058 182096 DEBUG nova.compute.provider_tree [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.069 182096 DEBUG nova.scheduler.client.report [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.085 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.085 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.157 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.157 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.171 182096 INFO nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.183 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.203 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.277 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.278 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.278 182096 INFO nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Creating image(s)
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.279 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.279 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.280 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:37 compute-0 kernel: tap82ac778e-8b (unregistering): left promiscuous mode
Jan 23 09:32:37 compute-0 NetworkManager[54920]: <info>  [1769160757.2884] device (tap82ac778e-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.291 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:37 compute-0 ovn_controller[94697]: 2026-01-23T09:32:37Z|00458|binding|INFO|Releasing lport 82ac778e-8b80-408b-8434-4614306b12a2 from this chassis (sb_readonly=0)
Jan 23 09:32:37 compute-0 ovn_controller[94697]: 2026-01-23T09:32:37Z|00459|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 down in Southbound
Jan 23 09:32:37 compute-0 ovn_controller[94697]: 2026-01-23T09:32:37Z|00460|binding|INFO|Removing iface tap82ac778e-8b ovn-installed in OVS
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.307 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.308 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f unbound from our chassis
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.309 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.312 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[11d64de7-3b8e-4844-9894-812dd6ca76da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.309 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.312 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f namespace which is not needed anymore
Jan 23 09:32:37 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 23 09:32:37 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007d.scope: Consumed 11.310s CPU time.
Jan 23 09:32:37 compute-0 systemd-machined[153562]: Machine qemu-63-instance-0000007d terminated.
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.357 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.357 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.359 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.368 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.384 182096 DEBUG nova.policy [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5868b02619e49e0b66f98db9a403cdd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21aea57050b240e4936b364484455df6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:32:37 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [NOTICE]   (224941) : haproxy version is 2.8.14-c23fe91
Jan 23 09:32:37 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [NOTICE]   (224941) : path to executable is /usr/sbin/haproxy
Jan 23 09:32:37 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [WARNING]  (224941) : Exiting Master process...
Jan 23 09:32:37 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [ALERT]    (224941) : Current worker (224943) exited with code 143 (Terminated)
Jan 23 09:32:37 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[224937]: [WARNING]  (224941) : All workers exited. Exiting... (0)
Jan 23 09:32:37 compute-0 systemd[1]: libpod-28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506.scope: Deactivated successfully.
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.418 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.419 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:37 compute-0 podman[225163]: 2026-01-23 09:32:37.423992169 +0000 UTC m=+0.035429063 container died 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.442 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.443 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.443 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-43809e5ad80077a2406d21182bfa1d1d0798285c9c47851d61f50cbeddb240df-merged.mount: Deactivated successfully.
Jan 23 09:32:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506-userdata-shm.mount: Deactivated successfully.
Jan 23 09:32:37 compute-0 podman[225163]: 2026-01-23 09:32:37.448930311 +0000 UTC m=+0.060367195 container cleanup 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:32:37 compute-0 systemd[1]: libpod-conmon-28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506.scope: Deactivated successfully.
Jan 23 09:32:37 compute-0 podman[225192]: 2026-01-23 09:32:37.492841623 +0000 UTC m=+0.028405244 container remove 28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.496 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4d965f70-e756-4bf4-a992-878fea77fc13]: (4, ('Fri Jan 23 09:32:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f (28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506)\n28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506\nFri Jan 23 09:32:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f (28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506)\n28862965d71720b94df644eb07ab093fd8b6a1a246e5bb7404cd4a5c480a6506\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.497 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2abaf700-e2b4-493a-a327-0954cdae6548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.498 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaaff0ba-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:37 compute-0 kernel: tapbaaff0ba-b0: left promiscuous mode
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.498 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.505 182096 DEBUG nova.virt.disk.api [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Checking if we can resize image /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.505 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:37 compute-0 NetworkManager[54920]: <info>  [1769160757.5156] manager: (tap82ac778e-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.520 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a133252d-467f-4294-bd59-94af31d95610]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.521 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.530 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1919bb57-193d-48b9-9346-958cc0a464d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.532 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa8c60d-442c-4fe5-ba4f-b6e7d3f39ae6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.549 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[689bdb59-10df-4b7e-accb-bbd7c773b807]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424320, 'reachable_time': 38286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225219, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 systemd[1]: run-netns-ovnmeta\x2dbaaff0ba\x2dbd4f\x2d42f9\x2da73e\x2dba8b1647f64f.mount: Deactivated successfully.
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.551 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:32:37 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:37.552 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[001785c0-05d8-4699-a451-db611802a0bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.566 182096 DEBUG nova.compute.manager [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.567 182096 DEBUG oslo_concurrency.lockutils [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.567 182096 DEBUG oslo_concurrency.lockutils [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.568 182096 DEBUG oslo_concurrency.lockutils [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.568 182096 DEBUG nova.compute.manager [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.568 182096 WARNING nova.compute.manager [req-b5da5017-319a-4cf6-b14e-38d9d3f55f29 req-d2c81fbd-3957-494d-94f4-0a25a0355772 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state active and task_state powering-off.
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.571 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.571 182096 DEBUG nova.virt.disk.api [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Cannot resize image /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.572 182096 DEBUG nova.objects.instance [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'migration_context' on Instance uuid ee953858-5a67-4803-a72b-f81eab35c0cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.584 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.584 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Ensure instance console log exists: /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.584 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.585 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:37 compute-0 nova_compute[182092]: 2026-01-23 09:32:37.585 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.126 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Successfully created port: 6fd2181b-103a-4d93-a7be-e15723e46df2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.168 182096 INFO nova.virt.libvirt.driver [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance shutdown successfully after 3 seconds.
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.172 182096 INFO nova.virt.libvirt.driver [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance destroyed successfully.
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.172 182096 DEBUG nova.objects.instance [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.206 182096 DEBUG nova.compute.manager [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:38 compute-0 nova_compute[182092]: 2026-01-23 09:32:38.270 182096 DEBUG oslo_concurrency.lockutils [None req-73c0e252-f8cb-4fb7-a778-a5676261d757 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.022 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Successfully updated port: 6fd2181b-103a-4d93-a7be-e15723e46df2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.044 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.044 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquired lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.044 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.135 182096 DEBUG nova.compute.manager [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-changed-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.135 182096 DEBUG nova.compute.manager [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Refreshing instance network info cache due to event network-changed-6fd2181b-103a-4d93-a7be-e15723e46df2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.135 182096 DEBUG oslo_concurrency.lockutils [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.218 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.580 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.623 182096 DEBUG nova.compute.manager [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.624 182096 DEBUG oslo_concurrency.lockutils [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.624 182096 DEBUG oslo_concurrency.lockutils [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.624 182096 DEBUG oslo_concurrency.lockutils [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.624 182096 DEBUG nova.compute.manager [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.624 182096 WARNING nova.compute.manager [req-d19285c1-4651-4b92-8ddc-9dcf6543be67 req-49177486-a2fa-4644-b766-04b3d40de783 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state stopped and task_state None.
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.853 182096 DEBUG nova.network.neutron [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Updating instance_info_cache with network_info: [{"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:39.864 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.885 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Releasing lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.885 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Instance network_info: |[{"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.885 182096 DEBUG oslo_concurrency.lockutils [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.886 182096 DEBUG nova.network.neutron [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Refreshing network info cache for port 6fd2181b-103a-4d93-a7be-e15723e46df2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.888 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Start _get_guest_xml network_info=[{"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.891 182096 WARNING nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.897 182096 DEBUG nova.virt.libvirt.host [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.898 182096 DEBUG nova.virt.libvirt.host [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.900 182096 DEBUG nova.virt.libvirt.host [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.901 182096 DEBUG nova.virt.libvirt.host [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.902 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.902 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.902 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.902 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.903 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.903 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.903 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.903 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.903 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.904 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.904 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.904 182096 DEBUG nova.virt.hardware [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.907 182096 DEBUG nova.virt.libvirt.vif [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-614548870',display_name='tempest-ServersTestJSON-server-614548870',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-614548870',id=128,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-rl0exvht',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:37Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=ee953858-5a67-4803-a72b-f81eab35c0cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.907 182096 DEBUG nova.network.os_vif_util [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.907 182096 DEBUG nova.network.os_vif_util [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.908 182096 DEBUG nova.objects.instance [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee953858-5a67-4803-a72b-f81eab35c0cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.918 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <uuid>ee953858-5a67-4803-a72b-f81eab35c0cf</uuid>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <name>instance-00000080</name>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersTestJSON-server-614548870</nova:name>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:32:39</nova:creationTime>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:user uuid="e5868b02619e49e0b66f98db9a403cdd">tempest-ServersTestJSON-1898793355-project-member</nova:user>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:project uuid="21aea57050b240e4936b364484455df6">tempest-ServersTestJSON-1898793355</nova:project>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         <nova:port uuid="6fd2181b-103a-4d93-a7be-e15723e46df2">
Jan 23 09:32:39 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <system>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="serial">ee953858-5a67-4803-a72b-f81eab35c0cf</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="uuid">ee953858-5a67-4803-a72b-f81eab35c0cf</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </system>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <os>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </os>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <features>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </features>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.config"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:e5:0c:e7"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <target dev="tap6fd2181b-10"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/console.log" append="off"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <video>
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </video>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:32:39 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:32:39 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:32:39 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:32:39 compute-0 nova_compute[182092]: </domain>
Jan 23 09:32:39 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.919 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Preparing to wait for external event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.919 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.919 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.919 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.920 182096 DEBUG nova.virt.libvirt.vif [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-614548870',display_name='tempest-ServersTestJSON-server-614548870',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-614548870',id=128,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-rl0exvht',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:37Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=ee953858-5a67-4803-a72b-f81eab35c0cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.920 182096 DEBUG nova.network.os_vif_util [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.920 182096 DEBUG nova.network.os_vif_util [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.921 182096 DEBUG os_vif [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.921 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.922 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.923 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.924 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6fd2181b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.924 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6fd2181b-10, col_values=(('external_ids', {'iface-id': '6fd2181b-103a-4d93-a7be-e15723e46df2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:0c:e7', 'vm-uuid': 'ee953858-5a67-4803-a72b-f81eab35c0cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.925 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:39 compute-0 NetworkManager[54920]: <info>  [1769160759.9264] manager: (tap6fd2181b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.928 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.932 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.933 182096 INFO os_vif [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10')
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.969 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.969 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.970 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No VIF found with MAC fa:16:3e:e5:0c:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:32:39 compute-0 nova_compute[182092]: 2026-01-23 09:32:39.970 182096 INFO nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Using config drive
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.001 182096 INFO nova.compute.manager [None req-c80a0195-b651-455d-b2b0-416e02a9b3b8 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Get console output
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.222 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'flavor' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.240 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'info_cache' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.256 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.256 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquired lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.257 182096 DEBUG nova.network.neutron [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.480 182096 INFO nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Creating config drive at /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.config
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.484 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbdplig2q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.604 182096 DEBUG oslo_concurrency.processutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbdplig2q" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.6466] manager: (tap6fd2181b-10): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Jan 23 09:32:40 compute-0 kernel: tap6fd2181b-10: entered promiscuous mode
Jan 23 09:32:40 compute-0 ovn_controller[94697]: 2026-01-23T09:32:40Z|00461|binding|INFO|Claiming lport 6fd2181b-103a-4d93-a7be-e15723e46df2 for this chassis.
Jan 23 09:32:40 compute-0 ovn_controller[94697]: 2026-01-23T09:32:40Z|00462|binding|INFO|6fd2181b-103a-4d93-a7be-e15723e46df2: Claiming fa:16:3e:e5:0c:e7 10.100.0.11
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.650 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.656 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:0c:e7 10.100.0.11'], port_security=['fa:16:3e:e5:0c:e7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ee953858-5a67-4803-a72b-f81eab35c0cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2143175e-47cb-4dfa-857f-9144f6e1b535', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21aea57050b240e4936b364484455df6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '59334022-f3b5-45fa-ac27-7162de1e3ccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e3af09-b59d-4b39-bcb5-29b66ddf8035, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=6fd2181b-103a-4d93-a7be-e15723e46df2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.657 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 6fd2181b-103a-4d93-a7be-e15723e46df2 in datapath 2143175e-47cb-4dfa-857f-9144f6e1b535 bound to our chassis
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.658 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:32:40 compute-0 ovn_controller[94697]: 2026-01-23T09:32:40Z|00463|binding|INFO|Setting lport 6fd2181b-103a-4d93-a7be-e15723e46df2 ovn-installed in OVS
Jan 23 09:32:40 compute-0 ovn_controller[94697]: 2026-01-23T09:32:40Z|00464|binding|INFO|Setting lport 6fd2181b-103a-4d93-a7be-e15723e46df2 up in Southbound
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.665 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.668 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.669 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9f1fe9-b216-472a-a2af-ca5e0c9c4d8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.671 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2143175e-41 in ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.673 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2143175e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.673 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a46dfe4a-8fd7-4c1e-b793-b251cf6da678]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.674 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[61e1c8e1-5bac-45f5-8d0d-abd0e3193c2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 systemd-machined[153562]: New machine qemu-64-instance-00000080.
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.682 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7e57c55e-0956-4317-ba49-c5abd6749203]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-00000080.
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.691 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3ddb08-bf1c-4dfb-ae28-2e57c7beb746]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 systemd-udevd[225255]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.7019] device (tap6fd2181b-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.7024] device (tap6fd2181b-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.715 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5638a553-c746-4ab1-ac85-e942ab9f2011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.719 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[49489043-b423-455d-8be4-66a8af35eadd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.7201] manager: (tap2143175e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.744 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[76187b1a-6ec8-4469-bc34-dad34ab9eb32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.746 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6366814c-19fa-437f-adba-51e503a35af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.7666] device (tap2143175e-40): carrier: link connected
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.771 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b59adc17-7ca1-41dc-babe-cc2d094fc740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.783 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[58b9cf2d-552a-4331-baa8-1993687a4280]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2143175e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f8:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426625, 'reachable_time': 18167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225280, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.792 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[331ab1d0-58fd-442a-a6d7-a83e77d55ea1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:f84c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426625, 'tstamp': 426625}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225281, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.804 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[97bd8537-a5ea-4d6d-ac39-1c782428bfa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2143175e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f8:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426625, 'reachable_time': 18167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225282, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.823 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fc47c0cb-7247-4143-94f8-4b816a70efc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.859 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[076f9cfb-c90e-417f-9dc2-07b3e44c5331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.860 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2143175e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.860 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.860 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2143175e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:40 compute-0 NetworkManager[54920]: <info>  [1769160760.8626] manager: (tap2143175e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Jan 23 09:32:40 compute-0 kernel: tap2143175e-40: entered promiscuous mode
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.863 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.867 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.870 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2143175e-40, col_values=(('external_ids', {'iface-id': '7af6767c-9c56-47a8-9b6d-5763cb79298d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:40 compute-0 ovn_controller[94697]: 2026-01-23T09:32:40Z|00465|binding|INFO|Releasing lport 7af6767c-9c56-47a8-9b6d-5763cb79298d from this chassis (sb_readonly=0)
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.871 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.887 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.887 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[96c3294f-5755-4200-bba9-4d3874a8db28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.888 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:32:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:40.888 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'env', 'PROCESS_TAG=haproxy-2143175e-47cb-4dfa-857f-9144f6e1b535', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2143175e-47cb-4dfa-857f-9144f6e1b535.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.912 182096 DEBUG nova.compute.manager [req-3f0c0dba-eeee-449a-a18d-6e9a6975d8d5 req-17bf2de8-f246-49b8-b1f9-3c68ff3629fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.912 182096 DEBUG oslo_concurrency.lockutils [req-3f0c0dba-eeee-449a-a18d-6e9a6975d8d5 req-17bf2de8-f246-49b8-b1f9-3c68ff3629fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.912 182096 DEBUG oslo_concurrency.lockutils [req-3f0c0dba-eeee-449a-a18d-6e9a6975d8d5 req-17bf2de8-f246-49b8-b1f9-3c68ff3629fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.912 182096 DEBUG oslo_concurrency.lockutils [req-3f0c0dba-eeee-449a-a18d-6e9a6975d8d5 req-17bf2de8-f246-49b8-b1f9-3c68ff3629fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:40 compute-0 nova_compute[182092]: 2026-01-23 09:32:40.913 182096 DEBUG nova.compute.manager [req-3f0c0dba-eeee-449a-a18d-6e9a6975d8d5 req-17bf2de8-f246-49b8-b1f9-3c68ff3629fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Processing event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:32:41 compute-0 podman[225320]: 2026-01-23 09:32:41.170325315 +0000 UTC m=+0.035011656 container create 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.184 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.186 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160761.1842742, ee953858-5a67-4803-a72b-f81eab35c0cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.186 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] VM Started (Lifecycle Event)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.192 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:32:41 compute-0 systemd[1]: Started libpod-conmon-1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3.scope.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.194 182096 INFO nova.virt.libvirt.driver [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Instance spawned successfully.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.195 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.214 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.214 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.215 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.215 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.216 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.216 182096 DEBUG nova.virt.libvirt.driver [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.219 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd4c0a29c86f85a04888c4a595b268f7a10b65f3adfbfe51b6029c208131874d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:32:41 compute-0 podman[225320]: 2026-01-23 09:32:41.234959978 +0000 UTC m=+0.099646340 container init 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.234 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:41 compute-0 podman[225320]: 2026-01-23 09:32:41.240665029 +0000 UTC m=+0.105351372 container start 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:32:41 compute-0 podman[225320]: 2026-01-23 09:32:41.155619103 +0000 UTC m=+0.020305465 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.265 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.266 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160761.1852443, ee953858-5a67-4803-a72b-f81eab35c0cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.266 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] VM Paused (Lifecycle Event)
Jan 23 09:32:41 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [NOTICE]   (225350) : New worker (225368) forked
Jan 23 09:32:41 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [NOTICE]   (225350) : Loading success.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.289 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.294 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160761.1885502, ee953858-5a67-4803-a72b-f81eab35c0cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.294 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] VM Resumed (Lifecycle Event)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.296 182096 INFO nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Took 4.02 seconds to spawn the instance on the hypervisor.
Jan 23 09:32:41 compute-0 podman[225338]: 2026-01-23 09:32:41.297068415 +0000 UTC m=+0.065547748 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.297 182096 DEBUG nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.312 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.314 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:41 compute-0 podman[225337]: 2026-01-23 09:32:41.329485095 +0000 UTC m=+0.096583352 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.340 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.387 182096 INFO nova.compute.manager [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Took 4.51 seconds to build instance.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.398 182096 DEBUG nova.network.neutron [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.401 182096 DEBUG oslo_concurrency.lockutils [None req-350353fe-4597-4e5a-b343-a69eec667c2d e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.407 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Releasing lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.422 182096 INFO nova.virt.libvirt.driver [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance destroyed successfully.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.424 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.436 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'resources' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.451 182096 DEBUG nova.virt.libvirt.vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:38Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.452 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.453 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.453 182096 DEBUG os_vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.454 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.455 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82ac778e-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.456 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.458 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.459 182096 INFO os_vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b')
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.463 182096 DEBUG nova.virt.libvirt.driver [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start _get_guest_xml network_info=[{"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.466 182096 WARNING nova.virt.libvirt.driver [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.471 182096 DEBUG nova.virt.libvirt.host [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.472 182096 DEBUG nova.virt.libvirt.host [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.474 182096 DEBUG nova.virt.libvirt.host [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.474 182096 DEBUG nova.virt.libvirt.host [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.475 182096 DEBUG nova.virt.libvirt.driver [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.475 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.476 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.477 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.477 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.477 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.477 182096 DEBUG nova.virt.hardware [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.477 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.493 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.554 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.555 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.555 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.556 182096 DEBUG oslo_concurrency.lockutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.557 182096 DEBUG nova.virt.libvirt.vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:38Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.557 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.558 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.558 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.570 182096 DEBUG nova.virt.libvirt.driver [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <uuid>4764c1b5-427e-4933-ae62-df47006aa673</uuid>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <name>instance-0000007d</name>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-580802037</nova:name>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:32:41</nova:creationTime>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:user uuid="2880f53bded147989ea61dc68ec0880e">tempest-TestNetworkAdvancedServerOps-169193993-project-member</nova:user>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:project uuid="5a5525bfc549464cace77d44548fb012">tempest-TestNetworkAdvancedServerOps-169193993</nova:project>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         <nova:port uuid="82ac778e-8b80-408b-8434-4614306b12a2">
Jan 23 09:32:41 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <system>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="serial">4764c1b5-427e-4933-ae62-df47006aa673</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="uuid">4764c1b5-427e-4933-ae62-df47006aa673</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </system>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <os>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </os>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <features>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </features>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk.config"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:45:d9:b0"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <target dev="tap82ac778e-8b"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/console.log" append="off"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <video>
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </video>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <input type="keyboard" bus="usb"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:32:41 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:32:41 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:32:41 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:32:41 compute-0 nova_compute[182092]: </domain>
Jan 23 09:32:41 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.571 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.629 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.630 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.689 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.690 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.708 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.752 182096 DEBUG nova.network.neutron [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Updated VIF entry in instance network info cache for port 6fd2181b-103a-4d93-a7be-e15723e46df2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.752 182096 DEBUG nova.network.neutron [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Updating instance_info_cache with network_info: [{"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.766 182096 DEBUG oslo_concurrency.lockutils [req-b2a95085-f492-482a-b856-d6a4636b16a0 req-98252dfc-b889-428c-af1f-2f92e3ef9f4f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-ee953858-5a67-4803-a72b-f81eab35c0cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.770 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.771 182096 DEBUG nova.virt.disk.api [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Checking if we can resize image /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.771 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.819 182096 DEBUG oslo_concurrency.processutils [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.820 182096 DEBUG nova.virt.disk.api [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Cannot resize image /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.820 182096 DEBUG nova.objects.instance [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'migration_context' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.831 182096 DEBUG nova.virt.libvirt.vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:38Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.831 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.832 182096 DEBUG nova.network.os_vif_util [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.832 182096 DEBUG os_vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.832 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.833 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.833 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.835 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.835 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82ac778e-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.836 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82ac778e-8b, col_values=(('external_ids', {'iface-id': '82ac778e-8b80-408b-8434-4614306b12a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:d9:b0', 'vm-uuid': '4764c1b5-427e-4933-ae62-df47006aa673'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.837 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 NetworkManager[54920]: <info>  [1769160761.8381] manager: (tap82ac778e-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.840 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.843 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.843 182096 INFO os_vif [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b')
Jan 23 09:32:41 compute-0 NetworkManager[54920]: <info>  [1769160761.8933] manager: (tap82ac778e-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/244)
Jan 23 09:32:41 compute-0 kernel: tap82ac778e-8b: entered promiscuous mode
Jan 23 09:32:41 compute-0 systemd-udevd[225269]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:32:41 compute-0 ovn_controller[94697]: 2026-01-23T09:32:41Z|00466|binding|INFO|Claiming lport 82ac778e-8b80-408b-8434-4614306b12a2 for this chassis.
Jan 23 09:32:41 compute-0 ovn_controller[94697]: 2026-01-23T09:32:41Z|00467|binding|INFO|82ac778e-8b80-408b-8434-4614306b12a2: Claiming fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.899 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 NetworkManager[54920]: <info>  [1769160761.9079] device (tap82ac778e-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:32:41 compute-0 NetworkManager[54920]: <info>  [1769160761.9085] device (tap82ac778e-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.911 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.912 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f bound to our chassis
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.914 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:41 compute-0 ovn_controller[94697]: 2026-01-23T09:32:41Z|00468|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 ovn-installed in OVS
Jan 23 09:32:41 compute-0 ovn_controller[94697]: 2026-01-23T09:32:41Z|00469|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 up in Southbound
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.917 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 nova_compute[182092]: 2026-01-23 09:32:41.920 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.925 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[695f0c70-c3e9-45c4-8174-e29e3a891b8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.925 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbaaff0ba-b1 in ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.927 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbaaff0ba-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.927 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0f412f-b6dd-412f-aec5-cae77fdd0b89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.928 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[59103aac-4fda-436b-ab21-14c1ef3f1d86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.937 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[1298c53d-8455-4ade-8ac7-4408142b4588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 systemd-machined[153562]: New machine qemu-65-instance-0000007d.
Jan 23 09:32:41 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-0000007d.
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.958 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b03b9b1e-ad5a-4fc2-a4e7-41cccf6823cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.979 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[290233bb-99cf-48c5-a064-718fd967049f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:41.983 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d874ba92-e386-4680-87fb-76d368e7cb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:41 compute-0 NetworkManager[54920]: <info>  [1769160761.9858] manager: (tapbaaff0ba-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/245)
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.008 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7a90a27e-20b7-4e23-b08f-4faa70d45cbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.012 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d9af832a-9081-4857-8fa7-f76bdcfed8ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 NetworkManager[54920]: <info>  [1769160762.0297] device (tapbaaff0ba-b0): carrier: link connected
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.034 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[63709f22-1623-455f-b3f4-a7cb07340943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.047 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[468f3743-12f1-4425-bd5a-64f95ab06c58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaaff0ba-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:07:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426751, 'reachable_time': 24497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225431, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.058 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2c51264c-d114-4f7b-923b-eb68e773653a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:7af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426751, 'tstamp': 426751}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225432, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.071 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e91bcfce-6f9b-44f0-a95d-581deaf8e617]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaaff0ba-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:07:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426751, 'reachable_time': 24497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225433, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.090 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d48fad91-af53-420d-a069-cd6456cb121e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.128 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfe1b47-c1ef-4bb1-b0e7-d2d3b6af1422]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.129 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaaff0ba-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.129 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.129 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaaff0ba-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:42 compute-0 kernel: tapbaaff0ba-b0: entered promiscuous mode
Jan 23 09:32:42 compute-0 NetworkManager[54920]: <info>  [1769160762.1318] manager: (tapbaaff0ba-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.133 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaaff0ba-b0, col_values=(('external_ids', {'iface-id': '715202ac-6ce8-4fb8-a4a3-465e6f0c31a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.136 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:32:42 compute-0 ovn_controller[94697]: 2026-01-23T09:32:42Z|00470|binding|INFO|Releasing lport 715202ac-6ce8-4fb8-a4a3-465e6f0c31a7 from this chassis (sb_readonly=0)
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.137 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4444c44e-4132-47bd-91cd-26d7d18d3d2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.138 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.pid.haproxy
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID baaff0ba-bd4f-42f9-a73e-ba8b1647f64f
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:32:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:42.138 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'env', 'PROCESS_TAG=haproxy-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/baaff0ba-bd4f-42f9-a73e-ba8b1647f64f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.148 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.158 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.203 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:42 compute-0 podman[225461]: 2026-01-23 09:32:42.431936427 +0000 UTC m=+0.031981268 container create 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.435 182096 DEBUG nova.compute.manager [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.436 182096 DEBUG oslo_concurrency.lockutils [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.436 182096 DEBUG oslo_concurrency.lockutils [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.436 182096 DEBUG oslo_concurrency.lockutils [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.436 182096 DEBUG nova.compute.manager [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:42 compute-0 nova_compute[182092]: 2026-01-23 09:32:42.437 182096 WARNING nova.compute.manager [req-ee808635-48e2-4e91-8a2c-b293a97e870f req-5c3952da-d7bb-4b85-9cb0-df7a77b26dcd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state stopped and task_state powering-on.
Jan 23 09:32:42 compute-0 systemd[1]: Started libpod-conmon-263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078.scope.
Jan 23 09:32:42 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b955cc5fa13e1d664d8e35815a4f4d5b854341d70fd6899d48d3359665815db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:32:42 compute-0 podman[225461]: 2026-01-23 09:32:42.416952372 +0000 UTC m=+0.016997233 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:32:42 compute-0 podman[225461]: 2026-01-23 09:32:42.515243023 +0000 UTC m=+0.115287894 container init 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:32:42 compute-0 podman[225461]: 2026-01-23 09:32:42.519318761 +0000 UTC m=+0.119363602 container start 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:32:42 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [NOTICE]   (225477) : New worker (225479) forked
Jan 23 09:32:42 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [NOTICE]   (225477) : Loading success.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.030 182096 DEBUG nova.compute.manager [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.031 182096 DEBUG oslo_concurrency.lockutils [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.031 182096 DEBUG oslo_concurrency.lockutils [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.031 182096 DEBUG oslo_concurrency.lockutils [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.031 182096 DEBUG nova.compute.manager [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] No waiting events found dispatching network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.031 182096 WARNING nova.compute.manager [req-f58d6e5d-f700-4a68-bf1a-3c63bc546abd req-ff2e32e4-e268-476d-9317-0e876b84ff5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received unexpected event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 for instance with vm_state active and task_state None.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.349 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.350 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.351 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.351 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.351 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.359 182096 INFO nova.compute.manager [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Terminating instance
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.368 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160748.3654304, 6eb3a9e9-805d-4468-bfd1-03aa390682f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.368 182096 INFO nova.compute.manager [-] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] VM Stopped (Lifecycle Event)
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.370 182096 DEBUG nova.compute.manager [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.370 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 4764c1b5-427e-4933-ae62-df47006aa673 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.370 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160763.3696163, 4764c1b5-427e-4933-ae62-df47006aa673 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.371 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Resumed (Lifecycle Event)
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.372 182096 DEBUG nova.compute.manager [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.375 182096 INFO nova.virt.libvirt.driver [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance rebooted successfully.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.375 182096 DEBUG nova.compute.manager [None req-b7090549-b0f2-4fbc-9331-99346872cae9 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:43 compute-0 kernel: tap6fd2181b-10 (unregistering): left promiscuous mode
Jan 23 09:32:43 compute-0 NetworkManager[54920]: <info>  [1769160763.3857] device (tap6fd2181b-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:32:43 compute-0 ovn_controller[94697]: 2026-01-23T09:32:43Z|00471|binding|INFO|Releasing lport 6fd2181b-103a-4d93-a7be-e15723e46df2 from this chassis (sb_readonly=0)
Jan 23 09:32:43 compute-0 ovn_controller[94697]: 2026-01-23T09:32:43Z|00472|binding|INFO|Setting lport 6fd2181b-103a-4d93-a7be-e15723e46df2 down in Southbound
Jan 23 09:32:43 compute-0 ovn_controller[94697]: 2026-01-23T09:32:43Z|00473|binding|INFO|Removing iface tap6fd2181b-10 ovn-installed in OVS
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.398 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.400 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:0c:e7 10.100.0.11'], port_security=['fa:16:3e:e5:0c:e7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ee953858-5a67-4803-a72b-f81eab35c0cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2143175e-47cb-4dfa-857f-9144f6e1b535', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21aea57050b240e4936b364484455df6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '59334022-f3b5-45fa-ac27-7162de1e3ccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e3af09-b59d-4b39-bcb5-29b66ddf8035, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=6fd2181b-103a-4d93-a7be-e15723e46df2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.402 182096 DEBUG nova.compute.manager [None req-d5379a2f-f2e5-4ed5-80a3-b6ffd193d3bd - - - - - -] [instance: 6eb3a9e9-805d-4468-bfd1-03aa390682f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.403 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 6fd2181b-103a-4d93-a7be-e15723e46df2 in datapath 2143175e-47cb-4dfa-857f-9144f6e1b535 unbound from our chassis
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.403 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.405 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2143175e-47cb-4dfa-857f-9144f6e1b535, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.408 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d643974b-78e2-43d6-a94e-439afe7adbcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.408 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 namespace which is not needed anymore
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.411 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.412 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.441 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160763.3698947, 4764c1b5-427e-4933-ae62-df47006aa673 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.441 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Started (Lifecycle Event)
Jan 23 09:32:43 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 23 09:32:43 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000080.scope: Consumed 2.528s CPU time.
Jan 23 09:32:43 compute-0 systemd-machined[153562]: Machine qemu-64-instance-00000080 terminated.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.460 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.463 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:32:43 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [NOTICE]   (225350) : haproxy version is 2.8.14-c23fe91
Jan 23 09:32:43 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [NOTICE]   (225350) : path to executable is /usr/sbin/haproxy
Jan 23 09:32:43 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [ALERT]    (225350) : Current worker (225368) exited with code 143 (Terminated)
Jan 23 09:32:43 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225334]: [WARNING]  (225350) : All workers exited. Exiting... (0)
Jan 23 09:32:43 compute-0 systemd[1]: libpod-1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3.scope: Deactivated successfully.
Jan 23 09:32:43 compute-0 podman[225509]: 2026-01-23 09:32:43.522041339 +0000 UTC m=+0.035198499 container died 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3-userdata-shm.mount: Deactivated successfully.
Jan 23 09:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd4c0a29c86f85a04888c4a595b268f7a10b65f3adfbfe51b6029c208131874d-merged.mount: Deactivated successfully.
Jan 23 09:32:43 compute-0 podman[225509]: 2026-01-23 09:32:43.538844616 +0000 UTC m=+0.052001775 container cleanup 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:32:43 compute-0 systemd[1]: libpod-conmon-1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3.scope: Deactivated successfully.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.586 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.591 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 podman[225532]: 2026-01-23 09:32:43.5944195 +0000 UTC m=+0.037861651 container remove 1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.601 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d073ac02-e602-4c16-8259-85c643c66cea]: (4, ('Fri Jan 23 09:32:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 (1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3)\n1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3\nFri Jan 23 09:32:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 (1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3)\n1d783b25f8e047e7ab2fbf6523932d1bcea4a28dc96f93cdd8f40eae67b4eab3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.602 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce83998-2763-4f49-b33f-ac02ddcde459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.603 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2143175e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.605 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 kernel: tap2143175e-40: left promiscuous mode
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.617 182096 INFO nova.virt.libvirt.driver [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Instance destroyed successfully.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.617 182096 DEBUG nova.objects.instance [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'resources' on Instance uuid ee953858-5a67-4803-a72b-f81eab35c0cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.621 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.622 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.626 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d3753469-247c-4a1b-a372-0d2aa28092a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.628 182096 DEBUG nova.virt.libvirt.vif [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-614548870',display_name='tempest-ServersTestJSON-server-614548870',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-614548870',id=128,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-rl0exvht',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:41Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=ee953858-5a67-4803-a72b-f81eab35c0cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.628 182096 DEBUG nova.network.os_vif_util [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "6fd2181b-103a-4d93-a7be-e15723e46df2", "address": "fa:16:3e:e5:0c:e7", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fd2181b-10", "ovs_interfaceid": "6fd2181b-103a-4d93-a7be-e15723e46df2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.629 182096 DEBUG nova.network.os_vif_util [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.629 182096 DEBUG os_vif [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.630 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.630 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6fd2181b-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.632 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.635 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.637 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.640 182096 INFO os_vif [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:0c:e7,bridge_name='br-int',has_traffic_filtering=True,id=6fd2181b-103a-4d93-a7be-e15723e46df2,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fd2181b-10')
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.640 182096 INFO nova.virt.libvirt.driver [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Deleting instance files /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf_del
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.640 182096 INFO nova.virt.libvirt.driver [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Deletion of /var/lib/nova/instances/ee953858-5a67-4803-a72b-f81eab35c0cf_del complete
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.643 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e593c02a-2c9b-49c1-8fa7-781da773efa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.643 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[afd39c9a-7b76-43c0-a460-ab0e288207af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.656 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[47644be7-cfb6-4b56-932b-7e371d753709]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426619, 'reachable_time': 23323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225565, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d2143175e\x2d47cb\x2d4dfa\x2d857f\x2d9144f6e1b535.mount: Deactivated successfully.
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.659 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:32:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:32:43.660 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[16a542ca-4454-4f27-86cc-abca2189805e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.693 182096 INFO nova.compute.manager [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.694 182096 DEBUG oslo.service.loopingcall [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.694 182096 DEBUG nova.compute.manager [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:32:43 compute-0 nova_compute[182092]: 2026-01-23 09:32:43.695 182096 DEBUG nova.network.neutron [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.401 182096 DEBUG nova.network.neutron [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.412 182096 INFO nova.compute.manager [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Took 0.72 seconds to deallocate network for instance.
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.459 182096 DEBUG nova.compute.manager [req-3da932f4-6301-44e8-a995-a67cb22ab19f req-93fea956-036b-47fe-91db-ddabb4e250c6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-vif-deleted-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.469 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.469 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.528 182096 DEBUG nova.compute.manager [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.528 182096 DEBUG oslo_concurrency.lockutils [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.529 182096 DEBUG oslo_concurrency.lockutils [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.529 182096 DEBUG oslo_concurrency.lockutils [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.529 182096 DEBUG nova.compute.manager [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.530 182096 WARNING nova.compute.manager [req-46cdc811-52bb-4d81-be10-658d0d2388f7 req-e534dee8-5eee-43de-97d5-e6cbf3f22efc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state active and task_state None.
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.547 182096 DEBUG nova.compute.provider_tree [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.558 182096 DEBUG nova.scheduler.client.report [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.573 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.590 182096 INFO nova.scheduler.client.report [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Deleted allocations for instance ee953858-5a67-4803-a72b-f81eab35c0cf
Jan 23 09:32:44 compute-0 nova_compute[182092]: 2026-01-23 09:32:44.636 182096 DEBUG oslo_concurrency.lockutils [None req-c37b6e48-f04c-48c6-97fb-72e1c75f74f9 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.094 182096 DEBUG nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-vif-unplugged-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.095 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.095 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.095 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.096 182096 DEBUG nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] No waiting events found dispatching network-vif-unplugged-6fd2181b-103a-4d93-a7be-e15723e46df2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.096 182096 WARNING nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received unexpected event network-vif-unplugged-6fd2181b-103a-4d93-a7be-e15723e46df2 for instance with vm_state deleted and task_state None.
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.096 182096 DEBUG nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.097 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.097 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.097 182096 DEBUG oslo_concurrency.lockutils [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "ee953858-5a67-4803-a72b-f81eab35c0cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.097 182096 DEBUG nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] No waiting events found dispatching network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.098 182096 WARNING nova.compute.manager [req-bde503bb-3029-4ed5-82b2-72226bce6133 req-34901670-c6c1-4e5f-a732-d6c0852e9737 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Received unexpected event network-vif-plugged-6fd2181b-103a-4d93-a7be-e15723e46df2 for instance with vm_state deleted and task_state None.
Jan 23 09:32:45 compute-0 nova_compute[182092]: 2026-01-23 09:32:45.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:47 compute-0 nova_compute[182092]: 2026-01-23 09:32:47.205 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:48 compute-0 nova_compute[182092]: 2026-01-23 09:32:48.045 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:48 compute-0 nova_compute[182092]: 2026-01-23 09:32:48.632 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:52 compute-0 nova_compute[182092]: 2026-01-23 09:32:52.206 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:53 compute-0 podman[225576]: 2026-01-23 09:32:53.219340384 +0000 UTC m=+0.048386354 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:32:53 compute-0 podman[225575]: 2026-01-23 09:32:53.243277217 +0000 UTC m=+0.073675518 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 09:32:53 compute-0 nova_compute[182092]: 2026-01-23 09:32:53.633 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:54 compute-0 ovn_controller[94697]: 2026-01-23T09:32:54Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:32:56 compute-0 podman[225612]: 2026-01-23 09:32:56.200359226 +0000 UTC m=+0.038411167 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public)
Jan 23 09:32:57 compute-0 nova_compute[182092]: 2026-01-23 09:32:57.208 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:32:58 compute-0 nova_compute[182092]: 2026-01-23 09:32:58.615 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160763.6141093, ee953858-5a67-4803-a72b-f81eab35c0cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:32:58 compute-0 nova_compute[182092]: 2026-01-23 09:32:58.616 182096 INFO nova.compute.manager [-] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] VM Stopped (Lifecycle Event)
Jan 23 09:32:58 compute-0 nova_compute[182092]: 2026-01-23 09:32:58.629 182096 DEBUG nova.compute.manager [None req-e6172c5f-b5b6-41a9-86a5-c505e0b2cefa - - - - - -] [instance: ee953858-5a67-4803-a72b-f81eab35c0cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:32:58 compute-0 nova_compute[182092]: 2026-01-23 09:32:58.634 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:01 compute-0 nova_compute[182092]: 2026-01-23 09:33:01.015 182096 INFO nova.compute.manager [None req-c1f3d185-a1cb-4683-a6c7-a40b7af330cf 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Get console output
Jan 23 09:33:01 compute-0 nova_compute[182092]: 2026-01-23 09:33:01.019 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.438 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.438 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.438 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.439 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.439 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.445 182096 INFO nova.compute.manager [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Terminating instance
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.452 182096 DEBUG nova.compute.manager [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:33:02 compute-0 kernel: tap82ac778e-8b (unregistering): left promiscuous mode
Jan 23 09:33:02 compute-0 NetworkManager[54920]: <info>  [1769160782.4800] device (tap82ac778e-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.486 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.488 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00474|binding|INFO|Releasing lport 82ac778e-8b80-408b-8434-4614306b12a2 from this chassis (sb_readonly=0)
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00475|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 down in Southbound
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00476|binding|INFO|Removing iface tap82ac778e-8b ovn-installed in OVS
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.494 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.497 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f unbound from our chassis
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.499 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.500 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[477eb515-1662-4522-84da-90b7e238f8b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.500 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f namespace which is not needed anymore
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.513 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 23 09:33:02 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007d.scope: Consumed 11.893s CPU time.
Jan 23 09:33:02 compute-0 systemd-machined[153562]: Machine qemu-65-instance-0000007d terminated.
Jan 23 09:33:02 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [NOTICE]   (225477) : haproxy version is 2.8.14-c23fe91
Jan 23 09:33:02 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [NOTICE]   (225477) : path to executable is /usr/sbin/haproxy
Jan 23 09:33:02 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [ALERT]    (225477) : Current worker (225479) exited with code 143 (Terminated)
Jan 23 09:33:02 compute-0 neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f[225473]: [WARNING]  (225477) : All workers exited. Exiting... (0)
Jan 23 09:33:02 compute-0 systemd[1]: libpod-263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078.scope: Deactivated successfully.
Jan 23 09:33:02 compute-0 conmon[225473]: conmon 263964f7a262abf3c6c7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078.scope/container/memory.events
Jan 23 09:33:02 compute-0 podman[225653]: 2026-01-23 09:33:02.601186905 +0000 UTC m=+0.036745896 container died 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:33:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078-userdata-shm.mount: Deactivated successfully.
Jan 23 09:33:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-3b955cc5fa13e1d664d8e35815a4f4d5b854341d70fd6899d48d3359665815db-merged.mount: Deactivated successfully.
Jan 23 09:33:02 compute-0 podman[225653]: 2026-01-23 09:33:02.62174245 +0000 UTC m=+0.057301440 container cleanup 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:33:02 compute-0 systemd[1]: libpod-conmon-263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078.scope: Deactivated successfully.
Jan 23 09:33:02 compute-0 kernel: tap82ac778e-8b: entered promiscuous mode
Jan 23 09:33:02 compute-0 NetworkManager[54920]: <info>  [1769160782.6669] manager: (tap82ac778e-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Jan 23 09:33:02 compute-0 systemd-udevd[225635]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00477|binding|INFO|Claiming lport 82ac778e-8b80-408b-8434-4614306b12a2 for this chassis.
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00478|binding|INFO|82ac778e-8b80-408b-8434-4614306b12a2: Claiming fa:16:3e:45:d9:b0 10.100.0.12
Jan 23 09:33:02 compute-0 podman[225677]: 2026-01-23 09:33:02.672946699 +0000 UTC m=+0.034757856 container remove 263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:33:02 compute-0 kernel: tap82ac778e-8b (unregistering): left promiscuous mode
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.677 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.678 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc7b660-110b-4977-8a53-ba699d5cc942]: (4, ('Fri Jan 23 09:33:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f (263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078)\n263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078\nFri Jan 23 09:33:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f (263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078)\n263964f7a262abf3c6c7cf64929641c0e5c7f5cd64cb9ea5a6bb6c7910da1078\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.679 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3572ac40-6fd3-4b8c-855e-27fdee213f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.680 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaaff0ba-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.684 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.687 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 kernel: tapbaaff0ba-b0: left promiscuous mode
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00479|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 ovn-installed in OVS
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00480|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 up in Southbound
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.698 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.703 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00481|binding|INFO|Releasing lport 82ac778e-8b80-408b-8434-4614306b12a2 from this chassis (sb_readonly=0)
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00482|binding|INFO|Setting lport 82ac778e-8b80-408b-8434-4614306b12a2 down in Southbound
Jan 23 09:33:02 compute-0 ovn_controller[94697]: 2026-01-23T09:33:02Z|00483|binding|INFO|Removing iface tap82ac778e-8b ovn-installed in OVS
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.705 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.707 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:d9:b0 10.100.0.12'], port_security=['fa:16:3e:45:d9:b0 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4764c1b5-427e-4933-ae62-df47006aa673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4604008e-5083-4e5d-81a5-72c28448263a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d568202-ce92-4fcd-8e08-378aa90d8dd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=82ac778e-8b80-408b-8434-4614306b12a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.705 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[295b24d6-59e1-4afa-aebd-f2f3718088d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.710 182096 INFO nova.virt.libvirt.driver [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance destroyed successfully.
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.711 182096 DEBUG nova.objects.instance [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'resources' on Instance uuid 4764c1b5-427e-4933-ae62-df47006aa673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.718 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.721 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7577b448-322c-44c7-bd76-a60373ff59c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.721 182096 DEBUG nova.virt.libvirt.vif [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-580802037',display_name='tempest-TestNetworkAdvancedServerOps-server-580802037',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-580802037',id=125,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG7XLzIgUzZQEdBK1DT4VJUEFsSE7ZjpBHSslVMBkSX7VE0L01lxxor2j6G45z6oPLU17GVmBem+PADDQDczbCXPhidh22jGIwjzDpR0mmF8KGuBBI4iMTTeM+hCT4rUiA==',key_name='tempest-TestNetworkAdvancedServerOps-470226595',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-knag5dj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:43Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=4764c1b5-427e-4933-ae62-df47006aa673,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.722 182096 DEBUG nova.network.os_vif_util [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "82ac778e-8b80-408b-8434-4614306b12a2", "address": "fa:16:3e:45:d9:b0", "network": {"id": "baaff0ba-bd4f-42f9-a73e-ba8b1647f64f", "bridge": "br-int", "label": "tempest-network-smoke--331431586", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ac778e-8b", "ovs_interfaceid": "82ac778e-8b80-408b-8434-4614306b12a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.722 182096 DEBUG nova.network.os_vif_util [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.723 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[42c3da87-8f83-4244-8755-bae937f2bd5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.722 182096 DEBUG os_vif [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.723 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.724 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82ac778e-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.725 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.726 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.728 182096 INFO os_vif [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:d9:b0,bridge_name='br-int',has_traffic_filtering=True,id=82ac778e-8b80-408b-8434-4614306b12a2,network=Network(baaff0ba-bd4f-42f9-a73e-ba8b1647f64f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ac778e-8b')
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.729 182096 INFO nova.virt.libvirt.driver [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Deleting instance files /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673_del
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.729 182096 INFO nova.virt.libvirt.driver [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Deletion of /var/lib/nova/instances/4764c1b5-427e-4933-ae62-df47006aa673_del complete
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.737 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9391bd47-e8ec-4d3d-a90d-162147e263fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426746, 'reachable_time': 27850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225701, 'error': None, 'target': 'ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 systemd[1]: run-netns-ovnmeta\x2dbaaff0ba\x2dbd4f\x2d42f9\x2da73e\x2dba8b1647f64f.mount: Deactivated successfully.
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.739 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-baaff0ba-bd4f-42f9-a73e-ba8b1647f64f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.740 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d59ab2-5d05-4635-9013-9aecc61d3332]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.740 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f unbound from our chassis
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.741 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.742 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[eee1c051-d15e-460a-992c-ded29fcc8976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.742 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 82ac778e-8b80-408b-8434-4614306b12a2 in datapath baaff0ba-bd4f-42f9-a73e-ba8b1647f64f unbound from our chassis
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.743 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baaff0ba-bd4f-42f9-a73e-ba8b1647f64f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:02.744 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1299a239-5e04-4d9f-b981-3eaf2ba44368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.772 182096 INFO nova.compute.manager [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.772 182096 DEBUG oslo.service.loopingcall [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.774 182096 DEBUG nova.compute.manager [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:33:02 compute-0 nova_compute[182092]: 2026-01-23 09:33:02.774 182096 DEBUG nova.network.neutron [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.498 182096 DEBUG nova.network.neutron [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.510 182096 INFO nova.compute.manager [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Took 0.74 seconds to deallocate network for instance.
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.550 182096 DEBUG nova.compute.manager [req-0e1c5e19-7d31-44ff-b6b3-6c7c7d6fe752 req-724a7c99-33d8-46cd-8a6d-2f5e241a756b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-deleted-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.556 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.556 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.607 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.607 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.607 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.607 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.607 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.608 182096 WARNING nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-unplugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state deleted and task_state None.
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.608 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.608 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.608 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.608 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.609 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.609 182096 WARNING nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state deleted and task_state None.
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.609 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.609 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "4764c1b5-427e-4933-ae62-df47006aa673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.609 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.610 182096 DEBUG oslo_concurrency.lockutils [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.610 182096 DEBUG nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] No waiting events found dispatching network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.610 182096 WARNING nova.compute.manager [req-22f7be88-abfb-40f2-919e-4282fbea9d44 req-39aa6d31-aeef-4035-8345-dbd5d6a6d78d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received unexpected event network-vif-plugged-82ac778e-8b80-408b-8434-4614306b12a2 for instance with vm_state deleted and task_state None.
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.616 182096 DEBUG nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.672 182096 DEBUG nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.673 182096 DEBUG nova.compute.provider_tree [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.844 182096 DEBUG nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.860 182096 DEBUG nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.886 182096 DEBUG nova.compute.provider_tree [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.903 182096 DEBUG nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.910 182096 DEBUG nova.compute.manager [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Received event network-changed-82ac778e-8b80-408b-8434-4614306b12a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.910 182096 DEBUG nova.compute.manager [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing instance network info cache due to event network-changed-82ac778e-8b80-408b-8434-4614306b12a2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.910 182096 DEBUG oslo_concurrency.lockutils [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.911 182096 DEBUG oslo_concurrency.lockutils [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.911 182096 DEBUG nova.network.neutron [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Refreshing network info cache for port 82ac778e-8b80-408b-8434-4614306b12a2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.922 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.941 182096 INFO nova.scheduler.client.report [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Deleted allocations for instance 4764c1b5-427e-4933-ae62-df47006aa673
Jan 23 09:33:03 compute-0 nova_compute[182092]: 2026-01-23 09:33:03.996 182096 DEBUG oslo_concurrency.lockutils [None req-b76279dc-e02f-4f5f-b7c1-ccb48c05a8a2 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "4764c1b5-427e-4933-ae62-df47006aa673" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:04 compute-0 nova_compute[182092]: 2026-01-23 09:33:04.044 182096 DEBUG nova.network.neutron [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:33:04 compute-0 nova_compute[182092]: 2026-01-23 09:33:04.314 182096 DEBUG nova.network.neutron [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:04 compute-0 nova_compute[182092]: 2026-01-23 09:33:04.334 182096 DEBUG oslo_concurrency.lockutils [req-780e22d9-78eb-435f-b63f-3034403850a0 req-1447c710-8e44-46b8-8b2f-c468f31ffb54 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-4764c1b5-427e-4933-ae62-df47006aa673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:05 compute-0 podman[225702]: 2026-01-23 09:33:05.226215695 +0000 UTC m=+0.064013754 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Jan 23 09:33:06 compute-0 nova_compute[182092]: 2026-01-23 09:33:06.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:06 compute-0 nova_compute[182092]: 2026-01-23 09:33:06.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:06 compute-0 nova_compute[182092]: 2026-01-23 09:33:06.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.085 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.279 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.289 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.662 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.686 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.686 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.686 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.686 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.725 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.892 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.893 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5693MB free_disk=73.26350784301758GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.893 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.893 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.941 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.941 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.965 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:07 compute-0 nova_compute[182092]: 2026-01-23 09:33:07.973 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:08 compute-0 nova_compute[182092]: 2026-01-23 09:33:08.013 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:33:08 compute-0 nova_compute[182092]: 2026-01-23 09:33:08.014 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.001 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.002 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.661 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:33:09 compute-0 nova_compute[182092]: 2026-01-23 09:33:09.661 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:10 compute-0 nova_compute[182092]: 2026-01-23 09:33:10.657 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.631 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.631 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.646 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.714 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.715 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.719 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.720 182096 INFO nova.compute.claims [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.796 182096 DEBUG nova.compute.provider_tree [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.804 182096 DEBUG nova.scheduler.client.report [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.820 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.821 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.856 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.856 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.867 182096 INFO nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.879 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.954 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.955 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.955 182096 INFO nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Creating image(s)
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.955 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.956 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.956 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:11 compute-0 nova_compute[182092]: 2026-01-23 09:33:11.966 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.013 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.014 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.015 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.024 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.058 182096 DEBUG nova.policy [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5868b02619e49e0b66f98db9a403cdd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21aea57050b240e4936b364484455df6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.069 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.070 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.091 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.092 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.092 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.139 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.139 182096 DEBUG nova.virt.disk.api [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Checking if we can resize image /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.140 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.191 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.192 182096 DEBUG nova.virt.disk.api [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Cannot resize image /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.192 182096 DEBUG nova.objects.instance [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'migration_context' on Instance uuid 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.201 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.202 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Ensure instance console log exists: /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.202 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.202 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.202 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:12 compute-0 podman[225740]: 2026-01-23 09:33:12.212226263 +0000 UTC m=+0.047367754 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:33:12 compute-0 podman[225741]: 2026-01-23 09:33:12.225437013 +0000 UTC m=+0.056588227 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.282 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:33:12 compute-0 nova_compute[182092]: 2026-01-23 09:33:12.726 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.054 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Successfully created port: affa6a36-18e7-479b-9ac3-f95817b06496 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.764 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Successfully updated port: affa6a36-18e7-479b-9ac3-f95817b06496 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.774 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.774 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquired lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.774 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.847 182096 DEBUG nova.compute.manager [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Received event network-changed-affa6a36-18e7-479b-9ac3-f95817b06496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.847 182096 DEBUG nova.compute.manager [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Refreshing instance network info cache due to event network-changed-affa6a36-18e7-479b-9ac3-f95817b06496. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.848 182096 DEBUG oslo_concurrency.lockutils [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:13.859 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:13.860 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.860 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:13 compute-0 nova_compute[182092]: 2026-01-23 09:33:13.887 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.831 182096 DEBUG nova.network.neutron [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Updating instance_info_cache with network_info: [{"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.850 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Releasing lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.850 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Instance network_info: |[{"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.850 182096 DEBUG oslo_concurrency.lockutils [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.851 182096 DEBUG nova.network.neutron [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Refreshing network info cache for port affa6a36-18e7-479b-9ac3-f95817b06496 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.853 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Start _get_guest_xml network_info=[{"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.856 182096 WARNING nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.859 182096 DEBUG nova.virt.libvirt.host [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.860 182096 DEBUG nova.virt.libvirt.host [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.865 182096 DEBUG nova.virt.libvirt.host [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.866 182096 DEBUG nova.virt.libvirt.host [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.867 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.867 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.867 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.867 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.868 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.868 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.868 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.868 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.868 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.869 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.869 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.869 182096 DEBUG nova.virt.hardware [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.872 182096 DEBUG nova.virt.libvirt.vif [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1968825900',display_name='tempest-ServersTestJSON-server-1968825900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1968825900',id=132,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-f3hufx9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:11Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.872 182096 DEBUG nova.network.os_vif_util [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.873 182096 DEBUG nova.network.os_vif_util [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.873 182096 DEBUG nova.objects.instance [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.882 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <uuid>9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad</uuid>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <name>instance-00000084</name>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:name>tempest-ServersTestJSON-server-1968825900</nova:name>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:33:14</nova:creationTime>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:user uuid="e5868b02619e49e0b66f98db9a403cdd">tempest-ServersTestJSON-1898793355-project-member</nova:user>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:project uuid="21aea57050b240e4936b364484455df6">tempest-ServersTestJSON-1898793355</nova:project>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         <nova:port uuid="affa6a36-18e7-479b-9ac3-f95817b06496">
Jan 23 09:33:14 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <system>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="serial">9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="uuid">9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </system>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <os>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </os>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <features>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </features>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.config"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:65:66:65"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <target dev="tapaffa6a36-18"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/console.log" append="off"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <video>
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </video>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:33:14 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:33:14 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:33:14 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:33:14 compute-0 nova_compute[182092]: </domain>
Jan 23 09:33:14 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.883 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Preparing to wait for external event network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.883 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.883 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.883 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.884 182096 DEBUG nova.virt.libvirt.vif [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1968825900',display_name='tempest-ServersTestJSON-server-1968825900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1968825900',id=132,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-f3hufx9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:11Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.884 182096 DEBUG nova.network.os_vif_util [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.884 182096 DEBUG nova.network.os_vif_util [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.885 182096 DEBUG os_vif [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.885 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.886 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.888 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.888 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaffa6a36-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.888 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaffa6a36-18, col_values=(('external_ids', {'iface-id': 'affa6a36-18e7-479b-9ac3-f95817b06496', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:66:65', 'vm-uuid': '9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.889 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:14 compute-0 NetworkManager[54920]: <info>  [1769160794.8905] manager: (tapaffa6a36-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.892 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.894 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.894 182096 INFO os_vif [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18')
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.922 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.922 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.923 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] No VIF found with MAC fa:16:3e:65:66:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:33:14 compute-0 nova_compute[182092]: 2026-01-23 09:33:14.923 182096 INFO nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Using config drive
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.273 182096 INFO nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Creating config drive at /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.config
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.277 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppqscg1s8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.396 182096 DEBUG oslo_concurrency.processutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppqscg1s8" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:15 compute-0 kernel: tapaffa6a36-18: entered promiscuous mode
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.4317] manager: (tapaffa6a36-18): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Jan 23 09:33:15 compute-0 ovn_controller[94697]: 2026-01-23T09:33:15Z|00484|binding|INFO|Claiming lport affa6a36-18e7-479b-9ac3-f95817b06496 for this chassis.
Jan 23 09:33:15 compute-0 ovn_controller[94697]: 2026-01-23T09:33:15Z|00485|binding|INFO|affa6a36-18e7-479b-9ac3-f95817b06496: Claiming fa:16:3e:65:66:65 10.100.0.5
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.433 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.435 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.437 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.449 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:66:65 10.100.0.5'], port_security=['fa:16:3e:65:66:65 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2143175e-47cb-4dfa-857f-9144f6e1b535', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21aea57050b240e4936b364484455df6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '59334022-f3b5-45fa-ac27-7162de1e3ccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e3af09-b59d-4b39-bcb5-29b66ddf8035, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=affa6a36-18e7-479b-9ac3-f95817b06496) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.450 103978 INFO neutron.agent.ovn.metadata.agent [-] Port affa6a36-18e7-479b-9ac3-f95817b06496 in datapath 2143175e-47cb-4dfa-857f-9144f6e1b535 bound to our chassis
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.451 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.458 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4e8f8dde-be99-4cd8-8201-81e5966eee12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.459 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2143175e-41 in ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.460 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2143175e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.460 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2dafbae1-8451-4b76-a96d-84d150057e41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 systemd-udevd[225800]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.462 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[45a45413-aaae-4ff0-94ea-20ee3fe55de4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.4704] device (tapaffa6a36-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.4709] device (tapaffa6a36-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.470 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a05f71-e693-469d-b572-9ce1532ab5e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 systemd-machined[153562]: New machine qemu-66-instance-00000084.
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.494 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[864e027d-59c1-42d8-922a-1e2ec3d7c335]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000084.
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.496 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 ovn_controller[94697]: 2026-01-23T09:33:15Z|00486|binding|INFO|Setting lport affa6a36-18e7-479b-9ac3-f95817b06496 ovn-installed in OVS
Jan 23 09:33:15 compute-0 ovn_controller[94697]: 2026-01-23T09:33:15Z|00487|binding|INFO|Setting lport affa6a36-18e7-479b-9ac3-f95817b06496 up in Southbound
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.500 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.515 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd31bce-3971-47df-ac6c-5520efc6624a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.519 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25305730-ae4d-4caf-8fdc-4ea6bdb9efff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.5213] manager: (tap2143175e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.542 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[90607e07-3c60-42e7-a58c-51cfe8efd293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.544 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ec91976f-0878-4c88-907f-af7c0c60b284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.5592] device (tap2143175e-40): carrier: link connected
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.563 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[81e4005f-8bf7-477e-b0a3-0f18dd91234f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.574 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[71cb1a98-1318-4728-8621-3997e8d41a08]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2143175e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f8:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430104, 'reachable_time': 17435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225827, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.584 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[51bf67ea-6c11-4266-b218-10797c4fb004]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:f84c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430104, 'tstamp': 430104}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225828, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.594 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[15ccc0d3-2f56-4087-b680-1223e4a41eaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2143175e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:f8:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430104, 'reachable_time': 17435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225829, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.611 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcc1aff-8760-4ac5-b4ce-25dddb3677b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.652 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4de99a-e2f7-497c-b03f-4fd55ddb4b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.654 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2143175e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.654 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.654 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2143175e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:15 compute-0 kernel: tap2143175e-40: entered promiscuous mode
Jan 23 09:33:15 compute-0 NetworkManager[54920]: <info>  [1769160795.6567] manager: (tap2143175e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.656 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.658 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2143175e-40, col_values=(('external_ids', {'iface-id': '7af6767c-9c56-47a8-9b6d-5763cb79298d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:15 compute-0 ovn_controller[94697]: 2026-01-23T09:33:15Z|00488|binding|INFO|Releasing lport 7af6767c-9c56-47a8-9b6d-5763cb79298d from this chassis (sb_readonly=0)
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.663 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.671 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.672 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[efcbedb8-5979-42e9-af93-7067977764e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.672 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/2143175e-47cb-4dfa-857f-9144f6e1b535.pid.haproxy
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 2143175e-47cb-4dfa-857f-9144f6e1b535
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:33:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:15.673 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'env', 'PROCESS_TAG=haproxy-2143175e-47cb-4dfa-857f-9144f6e1b535', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2143175e-47cb-4dfa-857f-9144f6e1b535.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.874 182096 DEBUG nova.compute.manager [req-a0591a78-a527-4969-ba08-bd377b0efd8f req-869e657d-da0d-472f-9298-c549b0193dd2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Received event network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.874 182096 DEBUG oslo_concurrency.lockutils [req-a0591a78-a527-4969-ba08-bd377b0efd8f req-869e657d-da0d-472f-9298-c549b0193dd2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.875 182096 DEBUG oslo_concurrency.lockutils [req-a0591a78-a527-4969-ba08-bd377b0efd8f req-869e657d-da0d-472f-9298-c549b0193dd2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.875 182096 DEBUG oslo_concurrency.lockutils [req-a0591a78-a527-4969-ba08-bd377b0efd8f req-869e657d-da0d-472f-9298-c549b0193dd2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.875 182096 DEBUG nova.compute.manager [req-a0591a78-a527-4969-ba08-bd377b0efd8f req-869e657d-da0d-472f-9298-c549b0193dd2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Processing event network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.888 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.888 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160795.8878057, 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.888 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] VM Started (Lifecycle Event)
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.891 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.897 182096 INFO nova.virt.libvirt.driver [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Instance spawned successfully.
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.898 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.908 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.910 182096 DEBUG nova.network.neutron [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Updated VIF entry in instance network info cache for port affa6a36-18e7-479b-9ac3-f95817b06496. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.910 182096 DEBUG nova.network.neutron [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Updating instance_info_cache with network_info: [{"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.914 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.917 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.917 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.918 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.918 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.918 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.919 182096 DEBUG nova.virt.libvirt.driver [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.925 182096 DEBUG oslo_concurrency.lockutils [req-c73d6251-3d58-417d-b873-75fa3fb2744a req-fa3dc599-6700-484a-b4db-ed5db01a1029 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.940 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.941 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160795.887874, 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.941 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] VM Paused (Lifecycle Event)
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.968 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.970 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160795.8957682, 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.971 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] VM Resumed (Lifecycle Event)
Jan 23 09:33:15 compute-0 podman[225864]: 2026-01-23 09:33:15.97253976 +0000 UTC m=+0.036159861 container create 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.989 182096 INFO nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Took 4.03 seconds to spawn the instance on the hypervisor.
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.990 182096 DEBUG nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.991 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:15 compute-0 nova_compute[182092]: 2026-01-23 09:33:15.995 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:15 compute-0 systemd[1]: Started libpod-conmon-836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16.scope.
Jan 23 09:33:16 compute-0 nova_compute[182092]: 2026-01-23 09:33:16.016 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:33:16 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:33:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf6869b768d069c0d12fb362271e82bf04605712be2f0d05b35a971eee1a9666/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:33:16 compute-0 podman[225864]: 2026-01-23 09:33:16.030961624 +0000 UTC m=+0.094581736 container init 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 09:33:16 compute-0 podman[225864]: 2026-01-23 09:33:16.041172345 +0000 UTC m=+0.104792446 container start 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:33:16 compute-0 podman[225864]: 2026-01-23 09:33:15.956699299 +0000 UTC m=+0.020319421 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:33:16 compute-0 nova_compute[182092]: 2026-01-23 09:33:16.052 182096 INFO nova.compute.manager [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Took 4.35 seconds to build instance.
Jan 23 09:33:16 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [NOTICE]   (225880) : New worker (225882) forked
Jan 23 09:33:16 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [NOTICE]   (225880) : Loading success.
Jan 23 09:33:16 compute-0 nova_compute[182092]: 2026-01-23 09:33:16.062 182096 DEBUG oslo_concurrency.lockutils [None req-d4176191-f14e-4322-a1ef-faf5ff98a299 e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:16.861 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.283 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.708 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160782.7067697, 4764c1b5-427e-4933-ae62-df47006aa673 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.708 182096 INFO nova.compute.manager [-] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] VM Stopped (Lifecycle Event)
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.726 182096 DEBUG nova.compute.manager [None req-274bb6c8-a6d1-416c-9e33-6f9feb486e09 - - - - - -] [instance: 4764c1b5-427e-4933-ae62-df47006aa673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.952 182096 DEBUG nova.compute.manager [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Received event network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.953 182096 DEBUG oslo_concurrency.lockutils [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.953 182096 DEBUG oslo_concurrency.lockutils [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.954 182096 DEBUG oslo_concurrency.lockutils [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.954 182096 DEBUG nova.compute.manager [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] No waiting events found dispatching network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:17 compute-0 nova_compute[182092]: 2026-01-23 09:33:17.954 182096 WARNING nova.compute.manager [req-70416952-4e93-423f-a370-b04227854774 req-846c4f16-08e2-4c7a-8307-9e4a3c8e6e56 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Received unexpected event network-vif-plugged-affa6a36-18e7-479b-9ac3-f95817b06496 for instance with vm_state active and task_state None.
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.504 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.505 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.505 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.505 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.506 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.512 182096 INFO nova.compute.manager [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Terminating instance
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.517 182096 DEBUG nova.compute.manager [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:33:18 compute-0 kernel: tapaffa6a36-18 (unregistering): left promiscuous mode
Jan 23 09:33:18 compute-0 NetworkManager[54920]: <info>  [1769160798.5343] device (tapaffa6a36-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:33:18 compute-0 ovn_controller[94697]: 2026-01-23T09:33:18Z|00489|binding|INFO|Releasing lport affa6a36-18e7-479b-9ac3-f95817b06496 from this chassis (sb_readonly=0)
Jan 23 09:33:18 compute-0 ovn_controller[94697]: 2026-01-23T09:33:18Z|00490|binding|INFO|Setting lport affa6a36-18e7-479b-9ac3-f95817b06496 down in Southbound
Jan 23 09:33:18 compute-0 ovn_controller[94697]: 2026-01-23T09:33:18Z|00491|binding|INFO|Removing iface tapaffa6a36-18 ovn-installed in OVS
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.546 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.549 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:66:65 10.100.0.5'], port_security=['fa:16:3e:65:66:65 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2143175e-47cb-4dfa-857f-9144f6e1b535', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21aea57050b240e4936b364484455df6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '59334022-f3b5-45fa-ac27-7162de1e3ccc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e3af09-b59d-4b39-bcb5-29b66ddf8035, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=affa6a36-18e7-479b-9ac3-f95817b06496) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.550 103978 INFO neutron.agent.ovn.metadata.agent [-] Port affa6a36-18e7-479b-9ac3-f95817b06496 in datapath 2143175e-47cb-4dfa-857f-9144f6e1b535 unbound from our chassis
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.551 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2143175e-47cb-4dfa-857f-9144f6e1b535, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.552 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a19d9a-9038-4668-bf55-280c437799c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.552 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 namespace which is not needed anymore
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000084.scope: Deactivated successfully.
Jan 23 09:33:18 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000084.scope: Consumed 2.934s CPU time.
Jan 23 09:33:18 compute-0 systemd-machined[153562]: Machine qemu-66-instance-00000084 terminated.
Jan 23 09:33:18 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [NOTICE]   (225880) : haproxy version is 2.8.14-c23fe91
Jan 23 09:33:18 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [NOTICE]   (225880) : path to executable is /usr/sbin/haproxy
Jan 23 09:33:18 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [WARNING]  (225880) : Exiting Master process...
Jan 23 09:33:18 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [ALERT]    (225880) : Current worker (225882) exited with code 143 (Terminated)
Jan 23 09:33:18 compute-0 neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535[225876]: [WARNING]  (225880) : All workers exited. Exiting... (0)
Jan 23 09:33:18 compute-0 systemd[1]: libpod-836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16.scope: Deactivated successfully.
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:33:18 compute-0 podman[225908]: 2026-01-23 09:33:18.653320838 +0000 UTC m=+0.035496690 container died 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 09:33:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16-userdata-shm.mount: Deactivated successfully.
Jan 23 09:33:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf6869b768d069c0d12fb362271e82bf04605712be2f0d05b35a971eee1a9666-merged.mount: Deactivated successfully.
Jan 23 09:33:18 compute-0 podman[225908]: 2026-01-23 09:33:18.674062344 +0000 UTC m=+0.056238195 container cleanup 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:33:18 compute-0 systemd[1]: libpod-conmon-836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16.scope: Deactivated successfully.
Jan 23 09:33:18 compute-0 podman[225932]: 2026-01-23 09:33:18.71557441 +0000 UTC m=+0.025505333 container remove 836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.719 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[16e70fc1-aa1f-4da7-ba66-09538d0e6430]: (4, ('Fri Jan 23 09:33:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 (836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16)\n836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16\nFri Jan 23 09:33:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 (836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16)\n836c50e2b612e1c57304145b9adca3c3f33d6063a774d9b9e843f3cc190c2f16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.722 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[49b64d18-0aed-4856-a484-5dbb1c0e8ecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.723 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2143175e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:18 compute-0 kernel: tap2143175e-40: left promiscuous mode
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.724 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 NetworkManager[54920]: <info>  [1769160798.7311] manager: (tapaffa6a36-18): new Tun device (/org/freedesktop/NetworkManager/Devices/252)
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.742 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.743 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.744 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[54575fe6-ed0d-4b53-bc61-c2621a5dc4fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.753 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee9dba1-9a24-4267-ae01-d3e9e35fa9ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.754 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ec0052-fa5a-40d2-add9-21fcf7cfbfaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.770 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[814c5e1a-9d9d-48aa-90be-ef41821214db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430099, 'reachable_time': 17738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225958, 'error': None, 'target': 'ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d2143175e\x2d47cb\x2d4dfa\x2d857f\x2d9144f6e1b535.mount: Deactivated successfully.
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.772 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2143175e-47cb-4dfa-857f-9144f6e1b535 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:33:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:18.772 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f437aec0-5f45-49f4-9e64-9c1e3c6869ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.773 182096 INFO nova.virt.libvirt.driver [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Instance destroyed successfully.
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.774 182096 DEBUG nova.objects.instance [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lazy-loading 'resources' on Instance uuid 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.782 182096 DEBUG nova.virt.libvirt.vif [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1968825900',display_name='tempest-ServersTestJSON-server-1968825900',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1968825900',id=132,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:33:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='21aea57050b240e4936b364484455df6',ramdisk_id='',reservation_id='r-f3hufx9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_m
in_ram='0',owner_project_name='tempest-ServersTestJSON-1898793355',owner_user_name='tempest-ServersTestJSON-1898793355-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:33:16Z,user_data=None,user_id='e5868b02619e49e0b66f98db9a403cdd',uuid=9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.782 182096 DEBUG nova.network.os_vif_util [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converting VIF {"id": "affa6a36-18e7-479b-9ac3-f95817b06496", "address": "fa:16:3e:65:66:65", "network": {"id": "2143175e-47cb-4dfa-857f-9144f6e1b535", "bridge": "br-int", "label": "tempest-ServersTestJSON-1186921283-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "21aea57050b240e4936b364484455df6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaffa6a36-18", "ovs_interfaceid": "affa6a36-18e7-479b-9ac3-f95817b06496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.783 182096 DEBUG nova.network.os_vif_util [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.783 182096 DEBUG os_vif [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.784 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.784 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaffa6a36-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.785 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.787 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.790 182096 INFO os_vif [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:66:65,bridge_name='br-int',has_traffic_filtering=True,id=affa6a36-18e7-479b-9ac3-f95817b06496,network=Network(2143175e-47cb-4dfa-857f-9144f6e1b535),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaffa6a36-18')
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.790 182096 INFO nova.virt.libvirt.driver [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Deleting instance files /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad_del
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.791 182096 INFO nova.virt.libvirt.driver [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Deletion of /var/lib/nova/instances/9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad_del complete
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.833 182096 INFO nova.compute.manager [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.834 182096 DEBUG oslo.service.loopingcall [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.834 182096 DEBUG nova.compute.manager [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:33:18 compute-0 nova_compute[182092]: 2026-01-23 09:33:18.834 182096 DEBUG nova.network.neutron [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.613 182096 DEBUG nova.network.neutron [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.630 182096 INFO nova.compute.manager [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Took 0.80 seconds to deallocate network for instance.
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.708 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.709 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.710 182096 DEBUG nova.compute.manager [req-fdc7dd2d-0d46-4f88-b7a1-9a7653dcbc73 req-97ec5720-1881-4e39-b495-79b252ed65d5 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Received event network-vif-deleted-affa6a36-18e7-479b-9ac3-f95817b06496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.764 182096 DEBUG nova.compute.provider_tree [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.774 182096 DEBUG nova.scheduler.client.report [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.944 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:19 compute-0 nova_compute[182092]: 2026-01-23 09:33:19.963 182096 INFO nova.scheduler.client.report [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Deleted allocations for instance 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.031 182096 DEBUG oslo_concurrency.lockutils [None req-455cbf04-fc2c-4bd6-b437-eb153c08ac6a e5868b02619e49e0b66f98db9a403cdd 21aea57050b240e4936b364484455df6 - - default default] Lock "9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.453 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.453 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.465 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.530 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.531 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.541 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.541 182096 INFO nova.compute.claims [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.634 182096 DEBUG nova.compute.provider_tree [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.643 182096 DEBUG nova.scheduler.client.report [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.671 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.671 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.714 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.714 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.727 182096 INFO nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.743 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.817 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.818 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.818 182096 INFO nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Creating image(s)
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.819 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.819 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.819 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.829 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.878 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.879 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.879 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.889 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.906 182096 DEBUG nova.policy [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2880f53bded147989ea61dc68ec0880e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5a5525bfc549464cace77d44548fb012', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.936 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.937 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.959 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.960 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:20 compute-0 nova_compute[182092]: 2026-01-23 09:33:20.960 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.008 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.009 182096 DEBUG nova.virt.disk.api [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Checking if we can resize image /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.009 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.071 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.072 182096 DEBUG nova.virt.disk.api [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Cannot resize image /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.072 182096 DEBUG nova.objects.instance [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'migration_context' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.084 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.084 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Ensure instance console log exists: /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.085 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.085 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.085 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:21 compute-0 nova_compute[182092]: 2026-01-23 09:33:21.873 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Successfully created port: 0772a773-aac8-4339-8c9d-5823b999c2fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.285 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.558 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Successfully updated port: 0772a773-aac8-4339-8c9d-5823b999c2fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.573 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.573 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquired lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.573 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:33:22 compute-0 nova_compute[182092]: 2026-01-23 09:33:22.720 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:33:23 compute-0 nova_compute[182092]: 2026-01-23 09:33:23.785 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.144 182096 DEBUG nova.network.neutron [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.162 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Releasing lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.162 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance network_info: |[{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.164 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Start _get_guest_xml network_info=[{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.168 182096 WARNING nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.171 182096 DEBUG nova.virt.libvirt.host [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.171 182096 DEBUG nova.virt.libvirt.host [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.174 182096 DEBUG nova.virt.libvirt.host [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.175 182096 DEBUG nova.virt.libvirt.host [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.175 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.176 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.177 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.177 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.177 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.177 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.177 182096 DEBUG nova.virt.hardware [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.180 182096 DEBUG nova.virt.libvirt.vif [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2069664660',display_name='tempest-TestNetworkAdvancedServerOps-server-2069664660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2069664660',id=134,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUQLriSjjHbXLEiMyQkKDLigOSd5Dv8tekDWyLjAQXwMXljix6ygFfhVVpjSn73NmGuRLe+/DJQuZoGzsEcBeSmuYq0fo7ZuqLW1a5vvWJzJEvEXIB26Ui/UebMH42/fw==',key_name='tempest-TestNetworkAdvancedServerOps-1219377951',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-a2jnembt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:20Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=17cffa6e-a77c-4bc8-965a-82b05ed586b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.180 182096 DEBUG nova.network.os_vif_util [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.181 182096 DEBUG nova.network.os_vif_util [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.181 182096 DEBUG nova.objects.instance [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.201 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <uuid>17cffa6e-a77c-4bc8-965a-82b05ed586b6</uuid>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <name>instance-00000086</name>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-2069664660</nova:name>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:33:24</nova:creationTime>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:user uuid="2880f53bded147989ea61dc68ec0880e">tempest-TestNetworkAdvancedServerOps-169193993-project-member</nova:user>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:project uuid="5a5525bfc549464cace77d44548fb012">tempest-TestNetworkAdvancedServerOps-169193993</nova:project>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         <nova:port uuid="0772a773-aac8-4339-8c9d-5823b999c2fd">
Jan 23 09:33:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <system>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="serial">17cffa6e-a77c-4bc8-965a-82b05ed586b6</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="uuid">17cffa6e-a77c-4bc8-965a-82b05ed586b6</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </system>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <os>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </os>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <features>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </features>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.config"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:c1:7e:1e"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <target dev="tap0772a773-aa"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/console.log" append="off"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <video>
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </video>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:33:24 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:33:24 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:33:24 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:33:24 compute-0 nova_compute[182092]: </domain>
Jan 23 09:33:24 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.201 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Preparing to wait for external event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.201 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.201 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.202 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.202 182096 DEBUG nova.virt.libvirt.vif [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2069664660',display_name='tempest-TestNetworkAdvancedServerOps-server-2069664660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2069664660',id=134,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUQLriSjjHbXLEiMyQkKDLigOSd5Dv8tekDWyLjAQXwMXljix6ygFfhVVpjSn73NmGuRLe+/DJQuZoGzsEcBeSmuYq0fo7ZuqLW1a5vvWJzJEvEXIB26Ui/UebMH42/fw==',key_name='tempest-TestNetworkAdvancedServerOps-1219377951',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-a2jnembt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:20Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=17cffa6e-a77c-4bc8-965a-82b05ed586b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.202 182096 DEBUG nova.network.os_vif_util [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.203 182096 DEBUG nova.network.os_vif_util [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.203 182096 DEBUG os_vif [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.203 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.204 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.204 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.206 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.206 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0772a773-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.206 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0772a773-aa, col_values=(('external_ids', {'iface-id': '0772a773-aac8-4339-8c9d-5823b999c2fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:7e:1e', 'vm-uuid': '17cffa6e-a77c-4bc8-965a-82b05ed586b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.2106] manager: (tap0772a773-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.209 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.213 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.214 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.214 182096 INFO os_vif [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa')
Jan 23 09:33:24 compute-0 podman[225978]: 2026-01-23 09:33:24.221318146 +0000 UTC m=+0.047760874 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:33:24 compute-0 podman[225979]: 2026-01-23 09:33:24.246269893 +0000 UTC m=+0.072592505 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.258 182096 DEBUG nova.compute.manager [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.259 182096 DEBUG nova.compute.manager [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing instance network info cache due to event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.259 182096 DEBUG oslo_concurrency.lockutils [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.259 182096 DEBUG oslo_concurrency.lockutils [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.259 182096 DEBUG nova.network.neutron [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.265 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.265 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.265 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] No VIF found with MAC fa:16:3e:c1:7e:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.265 182096 INFO nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Using config drive
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.588 182096 INFO nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Creating config drive at /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.config
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.593 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp343z5qk4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.712 182096 DEBUG oslo_concurrency.processutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp343z5qk4" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:24 compute-0 kernel: tap0772a773-aa: entered promiscuous mode
Jan 23 09:33:24 compute-0 ovn_controller[94697]: 2026-01-23T09:33:24Z|00492|binding|INFO|Claiming lport 0772a773-aac8-4339-8c9d-5823b999c2fd for this chassis.
Jan 23 09:33:24 compute-0 ovn_controller[94697]: 2026-01-23T09:33:24Z|00493|binding|INFO|0772a773-aac8-4339-8c9d-5823b999c2fd: Claiming fa:16:3e:c1:7e:1e 10.100.0.7
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.754 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.7563] manager: (tap0772a773-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/254)
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.757 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.769 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '2', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.770 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e bound to our chassis
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.771 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.780 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f13261e8-0224-4660-ac0b-bed272ce1abe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.781 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e99b582-91 in ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:33:24 compute-0 systemd-udevd[226034]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.782 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e99b582-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.782 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[492ee2dc-f325-4c06-a6eb-177b7befeeaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.783 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[725837eb-4642-44ab-b193-4e964fadd30b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.7934] device (tap0772a773-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.793 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9bc8c1-a539-4e18-a05a-f67f6dce1830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.7942] device (tap0772a773-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:33:24 compute-0 systemd-machined[153562]: New machine qemu-67-instance-00000086.
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.813 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9c4568a0-f0ef-4086-88ee-6f9322871650]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000086.
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.815 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 ovn_controller[94697]: 2026-01-23T09:33:24Z|00494|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd ovn-installed in OVS
Jan 23 09:33:24 compute-0 ovn_controller[94697]: 2026-01-23T09:33:24Z|00495|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd up in Southbound
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.819 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.837 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b95edb82-7b62-460a-950e-bc4d20279949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.840 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[14d1f673-b8b0-4b6f-b78c-8803080043bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.8430] manager: (tap7e99b582-90): new Veth device (/org/freedesktop/NetworkManager/Devices/255)
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.865 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[dc67aef3-5dfd-42ce-abf0-701d22010a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.868 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8459c25b-02be-4ea2-93e2-05789c5ea65c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.8873] device (tap7e99b582-90): carrier: link connected
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.890 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[cded3abc-cf73-4659-98df-966fa2913fc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.904 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3ba536-cac8-4b46-a1d4-0fd015e53603]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e99b582-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:a3:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431037, 'reachable_time': 43193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226061, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.916 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f1db546b-1620-4029-a6d8-4ddc32d17d13]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:a3c6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431037, 'tstamp': 431037}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226062, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.929 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a97a3746-6ee6-47cd-85d6-bf0f5769c0db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e99b582-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:a3:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431037, 'reachable_time': 43193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226063, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.951 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7295943-354e-4248-ac31-b15cee78e7f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.994 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7aa8764-6bfa-403d-8ada-dcf6ffb33949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.995 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e99b582-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.995 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:24.995 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e99b582-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.996 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:24 compute-0 NetworkManager[54920]: <info>  [1769160804.9975] manager: (tap7e99b582-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 23 09:33:24 compute-0 kernel: tap7e99b582-90: entered promiscuous mode
Jan 23 09:33:24 compute-0 nova_compute[182092]: 2026-01-23 09:33:24.998 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:25.001 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e99b582-90, col_values=(('external_ids', {'iface-id': 'c094d569-ccd0-4588-94bd-768eb8e26c27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.002 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:25 compute-0 ovn_controller[94697]: 2026-01-23T09:33:25Z|00496|binding|INFO|Releasing lport c094d569-ccd0-4588-94bd-768eb8e26c27 from this chassis (sb_readonly=0)
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.014 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.015 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:25.015 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:25.016 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[300f9443-c4f5-4880-9a7e-12b791b06277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:25.017 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:33:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:25.017 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'env', 'PROCESS_TAG=haproxy-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e99b582-97b3-4b8e-8fea-dd2badac679e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.053 182096 DEBUG nova.compute.manager [req-d4ca4572-6cab-4142-921a-a64b1313cbdb req-bbe8b13a-d7de-42fb-b76c-5818579dceb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.054 182096 DEBUG oslo_concurrency.lockutils [req-d4ca4572-6cab-4142-921a-a64b1313cbdb req-bbe8b13a-d7de-42fb-b76c-5818579dceb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.054 182096 DEBUG oslo_concurrency.lockutils [req-d4ca4572-6cab-4142-921a-a64b1313cbdb req-bbe8b13a-d7de-42fb-b76c-5818579dceb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.054 182096 DEBUG oslo_concurrency.lockutils [req-d4ca4572-6cab-4142-921a-a64b1313cbdb req-bbe8b13a-d7de-42fb-b76c-5818579dceb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.054 182096 DEBUG nova.compute.manager [req-d4ca4572-6cab-4142-921a-a64b1313cbdb req-bbe8b13a-d7de-42fb-b76c-5818579dceb6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Processing event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:33:25 compute-0 podman[226091]: 2026-01-23 09:33:25.294520262 +0000 UTC m=+0.032519565 container create c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:33:25 compute-0 systemd[1]: Started libpod-conmon-c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e.scope.
Jan 23 09:33:25 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:33:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a81d18ab00de79bb4dab651455b16b43b5b297b958072ce75d760b71e03eab31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:33:25 compute-0 podman[226091]: 2026-01-23 09:33:25.351136329 +0000 UTC m=+0.089135652 container init c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:33:25 compute-0 podman[226091]: 2026-01-23 09:33:25.358291857 +0000 UTC m=+0.096291170 container start c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:33:25 compute-0 podman[226091]: 2026-01-23 09:33:25.280234072 +0000 UTC m=+0.018233405 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:33:25 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [NOTICE]   (226112) : New worker (226115) forked
Jan 23 09:33:25 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [NOTICE]   (226112) : Loading success.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.421 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160805.420949, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.421 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Started (Lifecycle Event)
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.423 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.434 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.436 182096 INFO nova.virt.libvirt.driver [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance spawned successfully.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.436 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.440 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.442 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.449 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.449 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.450 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.450 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.450 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.450 182096 DEBUG nova.virt.libvirt.driver [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.454 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.454 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160805.4211006, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.454 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Paused (Lifecycle Event)
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.474 182096 DEBUG nova.network.neutron [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updated VIF entry in instance network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.475 182096 DEBUG nova.network.neutron [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.483 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.484 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160805.4249978, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.485 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Resumed (Lifecycle Event)
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.497 182096 DEBUG oslo_concurrency.lockutils [req-3778352c-fc3d-4deb-984d-f30b0f066b19 req-12d2e2d7-4d15-4767-98ba-3ec6e4c18db9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.501 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.503 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.521 182096 INFO nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Took 4.70 seconds to spawn the instance on the hypervisor.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.521 182096 DEBUG nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.525 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.574 182096 INFO nova.compute.manager [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Took 5.07 seconds to build instance.
Jan 23 09:33:25 compute-0 nova_compute[182092]: 2026-01-23 09:33:25.591 182096 DEBUG oslo_concurrency.lockutils [None req-92157272-50d3-41cf-9ce1-f774d738f803 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.158 182096 DEBUG nova.compute.manager [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.158 182096 DEBUG oslo_concurrency.lockutils [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.159 182096 DEBUG oslo_concurrency.lockutils [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.159 182096 DEBUG oslo_concurrency.lockutils [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.159 182096 DEBUG nova.compute.manager [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.159 182096 WARNING nova.compute.manager [req-2b056660-3aaa-4b4a-911c-a0bb2eb24179 req-4fd491fc-03df-454d-aa7d-b03688ef2058 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state active and task_state None.
Jan 23 09:33:27 compute-0 podman[226121]: 2026-01-23 09:33:27.220012609 +0000 UTC m=+0.051920672 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Jan 23 09:33:27 compute-0 NetworkManager[54920]: <info>  [1769160807.2954] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Jan 23 09:33:27 compute-0 NetworkManager[54920]: <info>  [1769160807.2964] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.297 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.379 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:27 compute-0 ovn_controller[94697]: 2026-01-23T09:33:27Z|00497|binding|INFO|Releasing lport c094d569-ccd0-4588-94bd-768eb8e26c27 from this chassis (sb_readonly=0)
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.393 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.514 182096 DEBUG nova.compute.manager [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.514 182096 DEBUG nova.compute.manager [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing instance network info cache due to event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.514 182096 DEBUG oslo_concurrency.lockutils [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.514 182096 DEBUG oslo_concurrency.lockutils [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:27 compute-0 nova_compute[182092]: 2026-01-23 09:33:27.514 182096 DEBUG nova.network.neutron [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:28 compute-0 nova_compute[182092]: 2026-01-23 09:33:28.501 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:28 compute-0 nova_compute[182092]: 2026-01-23 09:33:28.731 182096 DEBUG nova.network.neutron [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updated VIF entry in instance network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:33:28 compute-0 nova_compute[182092]: 2026-01-23 09:33:28.732 182096 DEBUG nova.network.neutron [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:28 compute-0 nova_compute[182092]: 2026-01-23 09:33:28.754 182096 DEBUG oslo_concurrency.lockutils [req-a67c9430-079e-4fc4-9a5b-fd40430d7f81 req-563f26fe-b0f8-4de0-8c4a-357976479859 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:29 compute-0 nova_compute[182092]: 2026-01-23 09:33:29.209 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:31 compute-0 nova_compute[182092]: 2026-01-23 09:33:31.921 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:32 compute-0 nova_compute[182092]: 2026-01-23 09:33:32.382 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:33 compute-0 nova_compute[182092]: 2026-01-23 09:33:33.771 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160798.7704084, 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:33 compute-0 nova_compute[182092]: 2026-01-23 09:33:33.771 182096 INFO nova.compute.manager [-] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] VM Stopped (Lifecycle Event)
Jan 23 09:33:33 compute-0 nova_compute[182092]: 2026-01-23 09:33:33.791 182096 DEBUG nova.compute.manager [None req-90224cd2-1b4d-4d2f-9165-ddedd0a2466f - - - - - -] [instance: 9fbfb6e2-44ef-4a5f-97c5-af56ea4a6aad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:34 compute-0 nova_compute[182092]: 2026-01-23 09:33:34.211 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:34 compute-0 nova_compute[182092]: 2026-01-23 09:33:34.941 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:36 compute-0 podman[226151]: 2026-01-23 09:33:36.232035891 +0000 UTC m=+0.065168813 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:33:37 compute-0 ovn_controller[94697]: 2026-01-23T09:33:37Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:7e:1e 10.100.0.7
Jan 23 09:33:37 compute-0 ovn_controller[94697]: 2026-01-23T09:33:37Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:7e:1e 10.100.0.7
Jan 23 09:33:37 compute-0 nova_compute[182092]: 2026-01-23 09:33:37.386 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:39 compute-0 ovn_controller[94697]: 2026-01-23T09:33:39Z|00498|binding|INFO|Releasing lport c094d569-ccd0-4588-94bd-768eb8e26c27 from this chassis (sb_readonly=0)
Jan 23 09:33:39 compute-0 nova_compute[182092]: 2026-01-23 09:33:39.148 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:39 compute-0 nova_compute[182092]: 2026-01-23 09:33:39.212 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:39.865 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:39.866 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:41 compute-0 ovn_controller[94697]: 2026-01-23T09:33:41Z|00499|binding|INFO|Releasing lport c094d569-ccd0-4588-94bd-768eb8e26c27 from this chassis (sb_readonly=0)
Jan 23 09:33:41 compute-0 nova_compute[182092]: 2026-01-23 09:33:41.987 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.387 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.572 182096 INFO nova.compute.manager [None req-e12c79a0-feae-42e1-b5bc-b2647d004aac 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Get console output
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.579 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.884 182096 DEBUG nova.objects.instance [None req-8f357e4e-7635-4ea9-9571-5e4d940643b7 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.904 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160822.903875, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.904 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Paused (Lifecycle Event)
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.922 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.924 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:42 compute-0 nova_compute[182092]: 2026-01-23 09:33:42.938 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 23 09:33:43 compute-0 podman[226180]: 2026-01-23 09:33:43.205879204 +0000 UTC m=+0.037034845 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:33:43 compute-0 podman[226179]: 2026-01-23 09:33:43.206380529 +0000 UTC m=+0.040081460 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:33:43 compute-0 kernel: tap0772a773-aa (unregistering): left promiscuous mode
Jan 23 09:33:43 compute-0 NetworkManager[54920]: <info>  [1769160823.4732] device (tap0772a773-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.481 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.482 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 ovn_controller[94697]: 2026-01-23T09:33:43Z|00500|binding|INFO|Releasing lport 0772a773-aac8-4339-8c9d-5823b999c2fd from this chassis (sb_readonly=0)
Jan 23 09:33:43 compute-0 ovn_controller[94697]: 2026-01-23T09:33:43Z|00501|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd down in Southbound
Jan 23 09:33:43 compute-0 ovn_controller[94697]: 2026-01-23T09:33:43Z|00502|binding|INFO|Removing iface tap0772a773-aa ovn-installed in OVS
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.494 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '4', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.495 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e unbound from our chassis
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.496 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e99b582-97b3-4b8e-8fea-dd2badac679e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.497 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dc68cf35-d301-4745-adc5-742185745a13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.498 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.498 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e namespace which is not needed anymore
Jan 23 09:33:43 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 23 09:33:43 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000086.scope: Consumed 11.742s CPU time.
Jan 23 09:33:43 compute-0 systemd-machined[153562]: Machine qemu-67-instance-00000086 terminated.
Jan 23 09:33:43 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [NOTICE]   (226112) : haproxy version is 2.8.14-c23fe91
Jan 23 09:33:43 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [NOTICE]   (226112) : path to executable is /usr/sbin/haproxy
Jan 23 09:33:43 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [WARNING]  (226112) : Exiting Master process...
Jan 23 09:33:43 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [ALERT]    (226112) : Current worker (226115) exited with code 143 (Terminated)
Jan 23 09:33:43 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226103]: [WARNING]  (226112) : All workers exited. Exiting... (0)
Jan 23 09:33:43 compute-0 systemd[1]: libpod-c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e.scope: Deactivated successfully.
Jan 23 09:33:43 compute-0 podman[226238]: 2026-01-23 09:33:43.591555334 +0000 UTC m=+0.034728193 container died c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:33:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e-userdata-shm.mount: Deactivated successfully.
Jan 23 09:33:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-a81d18ab00de79bb4dab651455b16b43b5b297b958072ce75d760b71e03eab31-merged.mount: Deactivated successfully.
Jan 23 09:33:43 compute-0 podman[226238]: 2026-01-23 09:33:43.614509599 +0000 UTC m=+0.057682448 container cleanup c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 09:33:43 compute-0 systemd[1]: libpod-conmon-c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e.scope: Deactivated successfully.
Jan 23 09:33:43 compute-0 podman[226261]: 2026-01-23 09:33:43.656358492 +0000 UTC m=+0.025132915 container remove c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.659 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0bfe09-02b0-4d96-bb1f-4f2019ebf9b4]: (4, ('Fri Jan 23 09:33:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e (c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e)\nc614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e\nFri Jan 23 09:33:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e (c614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e)\nc614e32fb666f3a92805c5a8205fc3c8fd9c6dcbf9eadb7164bf57fb8d03dc5e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.660 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[80d3f69c-1cf1-4aaa-a3f9-442dd1c95136]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.662 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e99b582-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.664 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 kernel: tap7e99b582-90: left promiscuous mode
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.681 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.683 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4b4376-0d3a-4c8a-8613-3df0a2ef83e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.694 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d61e4a-6f47-4fc4-a0a9-f3d67e4ab150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.694 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[335c4b41-8f4b-4915-a9fd-060658c919cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 nova_compute[182092]: 2026-01-23 09:33:43.704 182096 DEBUG nova.compute.manager [None req-8f357e4e-7635-4ea9-9571-5e4d940643b7 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.705 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3a085f-9bf1-47dd-b40f-c45e60d319c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431032, 'reachable_time': 36674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226289, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d7e99b582\x2d97b3\x2d4b8e\x2d8fea\x2ddd2badac679e.mount: Deactivated successfully.
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.709 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:33:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:43.709 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f87c9d-b736-4a01-888a-4221c4c534d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.214 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.520 182096 DEBUG nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.521 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.521 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.521 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.521 182096 DEBUG nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.522 182096 WARNING nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state suspended and task_state None.
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.522 182096 DEBUG nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.522 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.522 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.523 182096 DEBUG oslo_concurrency.lockutils [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.523 182096 DEBUG nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:44 compute-0 nova_compute[182092]: 2026-01-23 09:33:44.523 182096 WARNING nova.compute.manager [req-60f33aa0-66d3-4f8f-b151-6f4e5d3eefb6 req-5a240f0e-bcec-42b7-9ffb-3ecb74dc166d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state suspended and task_state None.
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.388 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.438 182096 INFO nova.compute.manager [None req-f621cde5-26f9-427d-b4d7-3191ba494f33 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Get console output
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.579 182096 INFO nova.compute.manager [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Resuming
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.580 182096 DEBUG nova.objects.instance [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'flavor' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.605 182096 DEBUG oslo_concurrency.lockutils [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.605 182096 DEBUG oslo_concurrency.lockutils [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquired lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:47 compute-0 nova_compute[182092]: 2026-01-23 09:33:47.605 182096 DEBUG nova.network.neutron [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.216 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.336 182096 DEBUG nova.network.neutron [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.354 182096 DEBUG oslo_concurrency.lockutils [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Releasing lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.357 182096 DEBUG nova.virt.libvirt.vif [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2069664660',display_name='tempest-TestNetworkAdvancedServerOps-server-2069664660',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2069664660',id=134,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUQLriSjjHbXLEiMyQkKDLigOSd5Dv8tekDWyLjAQXwMXljix6ygFfhVVpjSn73NmGuRLe+/DJQuZoGzsEcBeSmuYq0fo7ZuqLW1a5vvWJzJEvEXIB26Ui/UebMH42/fw==',key_name='tempest-TestNetworkAdvancedServerOps-1219377951',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:33:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-a2jnembt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:33:43Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=17cffa6e-a77c-4bc8-965a-82b05ed586b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.358 182096 DEBUG nova.network.os_vif_util [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.358 182096 DEBUG nova.network.os_vif_util [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.359 182096 DEBUG os_vif [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.359 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.360 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.360 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.363 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.364 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0772a773-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.364 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0772a773-aa, col_values=(('external_ids', {'iface-id': '0772a773-aac8-4339-8c9d-5823b999c2fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:7e:1e', 'vm-uuid': '17cffa6e-a77c-4bc8-965a-82b05ed586b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.364 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.364 182096 INFO os_vif [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa')
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.377 182096 DEBUG nova.objects.instance [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'numa_topology' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:49 compute-0 kernel: tap0772a773-aa: entered promiscuous mode
Jan 23 09:33:49 compute-0 ovn_controller[94697]: 2026-01-23T09:33:49Z|00503|binding|INFO|Claiming lport 0772a773-aac8-4339-8c9d-5823b999c2fd for this chassis.
Jan 23 09:33:49 compute-0 ovn_controller[94697]: 2026-01-23T09:33:49Z|00504|binding|INFO|0772a773-aac8-4339-8c9d-5823b999c2fd: Claiming fa:16:3e:c1:7e:1e 10.100.0.7
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.4253] manager: (tap0772a773-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.423 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '5', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.424 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e bound to our chassis
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.425 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:49 compute-0 ovn_controller[94697]: 2026-01-23T09:33:49Z|00505|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd ovn-installed in OVS
Jan 23 09:33:49 compute-0 ovn_controller[94697]: 2026-01-23T09:33:49Z|00506|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd up in Southbound
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.432 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.435 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.435 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e68b4b8d-7d2a-49ed-a64c-2391aec01416]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.436 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e99b582-91 in ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.437 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e99b582-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.437 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b668a861-2a91-4d11-b3b5-2beeb4dbf9f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.438 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[178c6a2d-1c4e-4fb7-ad53-5f376b31fac1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 systemd-udevd[226308]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.447 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd642a0-9d5b-4064-b72c-b27764b966e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 systemd-machined[153562]: New machine qemu-68-instance-00000086.
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.4544] device (tap0772a773-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.4552] device (tap0772a773-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.456 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[de19ca75-21d8-4dc5-9d1f-a3377643f3ba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000086.
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.476 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[28547719-b24f-407b-b828-a3a095f65370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.479 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[515a6a37-7b5d-45cb-88a9-3a70e5ffcaa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.4803] manager: (tap7e99b582-90): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.504 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f85074-6b44-4e26-95dc-88e8b41fb1e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.506 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[00150952-7a4f-45e7-9e0d-149c126ef242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.5216] device (tap7e99b582-90): carrier: link connected
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.524 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e3182d7c-efd5-4551-814e-d19505111238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.536 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d962e79-e6dc-4de7-8b2f-d878e1226572]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e99b582-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:a3:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433500, 'reachable_time': 33747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226333, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.545 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[be95705b-03ba-40f8-8411-58069b075b20]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:a3c6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433500, 'tstamp': 433500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226334, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.555 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a661e2ad-f347-4125-bfef-bffb36025316]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e99b582-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:a3:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433500, 'reachable_time': 33747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226335, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.576 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aac6b238-2ce2-43e1-899e-7461663c234b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.613 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[74096bbe-29a8-4425-8bc0-18152298691b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.614 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e99b582-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.614 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.614 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e99b582-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 kernel: tap7e99b582-90: entered promiscuous mode
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 NetworkManager[54920]: <info>  [1769160829.6174] manager: (tap7e99b582-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.622 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e99b582-90, col_values=(('external_ids', {'iface-id': 'c094d569-ccd0-4588-94bd-768eb8e26c27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:49 compute-0 ovn_controller[94697]: 2026-01-23T09:33:49Z|00507|binding|INFO|Releasing lport c094d569-ccd0-4588-94bd-768eb8e26c27 from this chassis (sb_readonly=0)
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.624 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.626 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.636 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.635 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c056dbf-1dff-4204-8e02-3a28d229b695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.637 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/7e99b582-97b3-4b8e-8fea-dd2badac679e.pid.haproxy
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 7e99b582-97b3-4b8e-8fea-dd2badac679e
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:33:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:49.637 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'env', 'PROCESS_TAG=haproxy-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e99b582-97b3-4b8e-8fea-dd2badac679e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.894 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 17cffa6e-a77c-4bc8-965a-82b05ed586b6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.896 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160829.8947282, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.896 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Started (Lifecycle Event)
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.911 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.917 182096 DEBUG nova.compute.manager [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.917 182096 DEBUG nova.objects.instance [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.920 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:49 compute-0 podman[226371]: 2026-01-23 09:33:49.926109677 +0000 UTC m=+0.040842536 container create bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.931 182096 INFO nova.virt.libvirt.driver [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance running successfully.
Jan 23 09:33:49 compute-0 virtqemud[181713]: argument unsupported: QEMU guest agent is not configured
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.933 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.933 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160829.902288, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.933 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Resumed (Lifecycle Event)
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.935 182096 DEBUG nova.virt.libvirt.guest [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.935 182096 DEBUG nova.compute.manager [None req-1d8151eb-522d-4352-8aa1-ada909e1012f 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.950 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.951 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:33:49 compute-0 systemd[1]: Started libpod-conmon-bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093.scope.
Jan 23 09:33:49 compute-0 nova_compute[182092]: 2026-01-23 09:33:49.970 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 23 09:33:49 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:33:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ab54c50da7e6f1dfc0b57736d16c82eff994b464a568f9db7347422adb47672/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:33:49 compute-0 podman[226371]: 2026-01-23 09:33:49.98337092 +0000 UTC m=+0.098103789 container init bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:33:49 compute-0 podman[226371]: 2026-01-23 09:33:49.987918908 +0000 UTC m=+0.102651767 container start bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:33:49 compute-0 podman[226371]: 2026-01-23 09:33:49.910878309 +0000 UTC m=+0.025611188 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:33:50 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [NOTICE]   (226388) : New worker (226390) forked
Jan 23 09:33:50 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [NOTICE]   (226388) : Loading success.
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.748 182096 DEBUG nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.748 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.748 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.748 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.749 182096 DEBUG nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.749 182096 WARNING nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state active and task_state None.
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.749 182096 DEBUG nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.749 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.749 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.750 182096 DEBUG oslo_concurrency.lockutils [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.750 182096 DEBUG nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:50 compute-0 nova_compute[182092]: 2026-01-23 09:33:50.750 182096 WARNING nova.compute.manager [req-618753c9-7c18-4734-a1e6-baf8b79833f2 req-e1609ad0-c3c0-4949-8170-0d39c6e4e16a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state active and task_state None.
Jan 23 09:33:51 compute-0 nova_compute[182092]: 2026-01-23 09:33:51.729 182096 INFO nova.compute.manager [None req-b31443b1-fd11-4b50-803c-313996eb66fc 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Get console output
Jan 23 09:33:51 compute-0 nova_compute[182092]: 2026-01-23 09:33:51.732 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.389 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.578 182096 DEBUG nova.compute.manager [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.578 182096 DEBUG nova.compute.manager [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing instance network info cache due to event network-changed-0772a773-aac8-4339-8c9d-5823b999c2fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.578 182096 DEBUG oslo_concurrency.lockutils [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.578 182096 DEBUG oslo_concurrency.lockutils [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.579 182096 DEBUG nova.network.neutron [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Refreshing network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.635 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.635 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.635 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.636 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.636 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.644 182096 INFO nova.compute.manager [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Terminating instance
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.650 182096 DEBUG nova.compute.manager [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:33:52 compute-0 kernel: tap0772a773-aa (unregistering): left promiscuous mode
Jan 23 09:33:52 compute-0 NetworkManager[54920]: <info>  [1769160832.6706] device (tap0772a773-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00508|binding|INFO|Releasing lport 0772a773-aac8-4339-8c9d-5823b999c2fd from this chassis (sb_readonly=0)
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00509|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd down in Southbound
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00510|binding|INFO|Removing iface tap0772a773-aa ovn-installed in OVS
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.681 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.685 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.693 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.694 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e unbound from our chassis
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.696 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e99b582-97b3-4b8e-8fea-dd2badac679e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.696 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aef1187b-d737-4701-8d35-6c851df69084]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.698 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e namespace which is not needed anymore
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.697 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 23 09:33:52 compute-0 systemd-machined[153562]: Machine qemu-68-instance-00000086 terminated.
Jan 23 09:33:52 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [NOTICE]   (226388) : haproxy version is 2.8.14-c23fe91
Jan 23 09:33:52 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [NOTICE]   (226388) : path to executable is /usr/sbin/haproxy
Jan 23 09:33:52 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [WARNING]  (226388) : Exiting Master process...
Jan 23 09:33:52 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [ALERT]    (226388) : Current worker (226390) exited with code 143 (Terminated)
Jan 23 09:33:52 compute-0 neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e[226383]: [WARNING]  (226388) : All workers exited. Exiting... (0)
Jan 23 09:33:52 compute-0 systemd[1]: libpod-bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093.scope: Deactivated successfully.
Jan 23 09:33:52 compute-0 podman[226417]: 2026-01-23 09:33:52.793593601 +0000 UTC m=+0.033007257 container died bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:33:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093-userdata-shm.mount: Deactivated successfully.
Jan 23 09:33:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-4ab54c50da7e6f1dfc0b57736d16c82eff994b464a568f9db7347422adb47672-merged.mount: Deactivated successfully.
Jan 23 09:33:52 compute-0 podman[226417]: 2026-01-23 09:33:52.812328367 +0000 UTC m=+0.051742034 container cleanup bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:33:52 compute-0 systemd[1]: libpod-conmon-bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093.scope: Deactivated successfully.
Jan 23 09:33:52 compute-0 podman[226440]: 2026-01-23 09:33:52.858439179 +0000 UTC m=+0.026589142 container remove bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:33:52 compute-0 kernel: tap0772a773-aa: entered promiscuous mode
Jan 23 09:33:52 compute-0 kernel: tap0772a773-aa (unregistering): left promiscuous mode
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.861 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[08bb7324-a4b1-4dfd-af79-260b778c167a]: (4, ('Fri Jan 23 09:33:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e (bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093)\nbcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093\nFri Jan 23 09:33:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e (bcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093)\nbcfd0cb1575200f3897f67fe30d8675e9bc0fa6d72c58bfb5a0e9416b11e5093\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 systemd-udevd[226398]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:52 compute-0 NetworkManager[54920]: <info>  [1769160832.8680] manager: (tap0772a773-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.866 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[17c3d0b9-1f06-4cca-b47c-7887a3dfe13a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.868 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e99b582-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00511|binding|INFO|Claiming lport 0772a773-aac8-4339-8c9d-5823b999c2fd for this chassis.
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00512|binding|INFO|0772a773-aac8-4339-8c9d-5823b999c2fd: Claiming fa:16:3e:c1:7e:1e 10.100.0.7
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.881 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:52 compute-0 kernel: tap7e99b582-90: left promiscuous mode
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.898 182096 INFO nova.virt.libvirt.driver [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Instance destroyed successfully.
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.898 182096 DEBUG nova.objects.instance [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lazy-loading 'resources' on Instance uuid 17cffa6e-a77c-4bc8-965a-82b05ed586b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00513|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd ovn-installed in OVS
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00514|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd up in Southbound
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00515|binding|INFO|Releasing lport 0772a773-aac8-4339-8c9d-5823b999c2fd from this chassis (sb_readonly=1)
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00516|if_status|INFO|Not setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd down as sb is readonly
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00517|binding|INFO|Removing iface tap0772a773-aa ovn-installed in OVS
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00518|binding|INFO|Releasing lport 0772a773-aac8-4339-8c9d-5823b999c2fd from this chassis (sb_readonly=0)
Jan 23 09:33:52 compute-0 ovn_controller[94697]: 2026-01-23T09:33:52Z|00519|binding|INFO|Setting lport 0772a773-aac8-4339-8c9d-5823b999c2fd down in Southbound
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.902 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bd505b94-0370-4c44-96ff-a1a37fa09e77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.904 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.908 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:7e:1e 10.100.0.7'], port_security=['fa:16:3e:c1:7e:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '17cffa6e-a77c-4bc8-965a-82b05ed586b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a5525bfc549464cace77d44548fb012', 'neutron:revision_number': '6', 'neutron:security_group_ids': '993818f6-1b7e-42c6-ba0b-9eb803a2a7ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61785ad5-8ba8-4f1d-8081-40b8ea412d48, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=0772a773-aac8-4339-8c9d-5823b999c2fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.912 182096 DEBUG nova.virt.libvirt.vif [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:33:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2069664660',display_name='tempest-TestNetworkAdvancedServerOps-server-2069664660',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2069664660',id=134,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUQLriSjjHbXLEiMyQkKDLigOSd5Dv8tekDWyLjAQXwMXljix6ygFfhVVpjSn73NmGuRLe+/DJQuZoGzsEcBeSmuYq0fo7ZuqLW1a5vvWJzJEvEXIB26Ui/UebMH42/fw==',key_name='tempest-TestNetworkAdvancedServerOps-1219377951',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:33:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5a5525bfc549464cace77d44548fb012',ramdisk_id='',reservation_id='r-a2jnembt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-169193993',owner_user_name='tempest-TestNetworkAdvancedServerOps-169193993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:33:49Z,user_data=None,user_id='2880f53bded147989ea61dc68ec0880e',uuid=17cffa6e-a77c-4bc8-965a-82b05ed586b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.912 182096 DEBUG nova.network.os_vif_util [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converting VIF {"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.912 182096 DEBUG nova.network.os_vif_util [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.913 182096 DEBUG os_vif [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.915 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.915 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0772a773-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.917 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.922 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.922 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5c041-9cea-4377-ba34-bbe6130d8aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.923 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e9279d-4f5f-4a2e-b3e0-a069ad4c6fe7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.923 182096 INFO os_vif [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:7e:1e,bridge_name='br-int',has_traffic_filtering=True,id=0772a773-aac8-4339-8c9d-5823b999c2fd,network=Network(7e99b582-97b3-4b8e-8fea-dd2badac679e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0772a773-aa')
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.924 182096 INFO nova.virt.libvirt.driver [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Deleting instance files /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6_del
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.924 182096 INFO nova.virt.libvirt.driver [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Deletion of /var/lib/nova/instances/17cffa6e-a77c-4bc8-965a-82b05ed586b6_del complete
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.935 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d45d1e0d-6dfc-43f8-8187-7bc419fc3816]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433495, 'reachable_time': 18303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226464, 'error': None, 'target': 'ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d7e99b582\x2d97b3\x2d4b8e\x2d8fea\x2ddd2badac679e.mount: Deactivated successfully.
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.938 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e99b582-97b3-4b8e-8fea-dd2badac679e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.938 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[560578e1-545c-476b-99fa-28909748b200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.939 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e unbound from our chassis
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.940 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e99b582-97b3-4b8e-8fea-dd2badac679e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.941 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e51ced-63e7-4bee-9362-ab74d0daf1d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.941 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 0772a773-aac8-4339-8c9d-5823b999c2fd in datapath 7e99b582-97b3-4b8e-8fea-dd2badac679e unbound from our chassis
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.942 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e99b582-97b3-4b8e-8fea-dd2badac679e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:33:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:52.942 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f81cbe1f-43a4-42a0-a793-a765627d7361]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.976 182096 INFO nova.compute.manager [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.977 182096 DEBUG oslo.service.loopingcall [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.977 182096 DEBUG nova.compute.manager [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:33:52 compute-0 nova_compute[182092]: 2026-01-23 09:33:52.978 182096 DEBUG nova.network.neutron [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.276 182096 DEBUG nova.compute.manager [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.276 182096 DEBUG oslo_concurrency.lockutils [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.277 182096 DEBUG oslo_concurrency.lockutils [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.277 182096 DEBUG oslo_concurrency.lockutils [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.277 182096 DEBUG nova.compute.manager [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.277 182096 DEBUG nova.compute.manager [req-ed34135c-8c1d-4e25-a9c2-2a56f067e036 req-6867a2d7-7047-44a9-b9d7-06f4f305a4f8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-unplugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.511 182096 DEBUG nova.network.neutron [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.522 182096 INFO nova.compute.manager [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Took 0.54 seconds to deallocate network for instance.
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.563 182096 DEBUG nova.compute.manager [req-f5019bfe-aee3-4ff4-940b-fc7b6c91477e req-4487073b-13e1-4777-8b38-ad923b0f562a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-deleted-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.583 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.584 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.598 182096 DEBUG nova.network.neutron [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updated VIF entry in instance network info cache for port 0772a773-aac8-4339-8c9d-5823b999c2fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.598 182096 DEBUG nova.network.neutron [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Updating instance_info_cache with network_info: [{"id": "0772a773-aac8-4339-8c9d-5823b999c2fd", "address": "fa:16:3e:c1:7e:1e", "network": {"id": "7e99b582-97b3-4b8e-8fea-dd2badac679e", "bridge": "br-int", "label": "tempest-network-smoke--1771427752", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5a5525bfc549464cace77d44548fb012", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0772a773-aa", "ovs_interfaceid": "0772a773-aac8-4339-8c9d-5823b999c2fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.613 182096 DEBUG oslo_concurrency.lockutils [req-f69a2927-61ee-406d-b12c-f3124db5def6 req-5c2a39a5-42e0-4a90-83bf-83c04c693c23 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-17cffa6e-a77c-4bc8-965a-82b05ed586b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.624 182096 DEBUG nova.compute.provider_tree [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.633 182096 DEBUG nova.scheduler.client.report [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.653 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.671 182096 INFO nova.scheduler.client.report [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Deleted allocations for instance 17cffa6e-a77c-4bc8-965a-82b05ed586b6
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.720 182096 DEBUG oslo_concurrency.lockutils [None req-378fdfd3-8f20-4b42-8ce6-f06159f273c0 2880f53bded147989ea61dc68ec0880e 5a5525bfc549464cace77d44548fb012 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.890 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.890 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.904 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.961 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.961 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.966 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:33:53 compute-0 nova_compute[182092]: 2026-01-23 09:33:53.966 182096 INFO nova.compute.claims [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.043 182096 DEBUG nova.compute.provider_tree [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.051 182096 DEBUG nova.scheduler.client.report [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.065 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.065 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.105 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.105 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.120 182096 INFO nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.135 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.212 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.212 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.213 182096 INFO nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Creating image(s)
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.213 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.213 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.214 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.223 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.245 182096 DEBUG nova.policy [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.271 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.271 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.272 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.281 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.333 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.334 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.367 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.368 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.368 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.414 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.414 182096 DEBUG nova.virt.disk.api [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Checking if we can resize image /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.415 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.461 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.462 182096 DEBUG nova.virt.disk.api [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Cannot resize image /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.462 182096 DEBUG nova.objects.instance [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'migration_context' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.477 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.477 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Ensure instance console log exists: /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.478 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.478 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:54 compute-0 nova_compute[182092]: 2026-01-23 09:33:54.478 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:55 compute-0 podman[226481]: 2026-01-23 09:33:55.213265589 +0000 UTC m=+0.049247338 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:33:55 compute-0 podman[226480]: 2026-01-23 09:33:55.214235618 +0000 UTC m=+0.051386644 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.376 182096 DEBUG nova.compute.manager [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.377 182096 DEBUG oslo_concurrency.lockutils [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.377 182096 DEBUG oslo_concurrency.lockutils [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.377 182096 DEBUG oslo_concurrency.lockutils [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "17cffa6e-a77c-4bc8-965a-82b05ed586b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.377 182096 DEBUG nova.compute.manager [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] No waiting events found dispatching network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.377 182096 WARNING nova.compute.manager [req-4705b77e-25f9-48e7-b3d7-b1343181d543 req-c74061e9-c0b6-4d2c-a9f5-e977e62afc2e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Received unexpected event network-vif-plugged-0772a773-aac8-4339-8c9d-5823b999c2fd for instance with vm_state deleted and task_state None.
Jan 23 09:33:55 compute-0 nova_compute[182092]: 2026-01-23 09:33:55.804 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Successfully created port: eaf8bed9-e3ff-491c-adfa-61a91e2b0509 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.297 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Successfully updated port: eaf8bed9-e3ff-491c-adfa-61a91e2b0509 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.314 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.314 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.314 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.354 182096 DEBUG nova.compute.manager [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-changed-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.354 182096 DEBUG nova.compute.manager [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Refreshing instance network info cache due to event network-changed-eaf8bed9-e3ff-491c-adfa-61a91e2b0509. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:33:56 compute-0 nova_compute[182092]: 2026-01-23 09:33:56.355 182096 DEBUG oslo_concurrency.lockutils [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:33:57 compute-0 nova_compute[182092]: 2026-01-23 09:33:57.330 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:33:57 compute-0 nova_compute[182092]: 2026-01-23 09:33:57.390 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:57 compute-0 nova_compute[182092]: 2026-01-23 09:33:57.917 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:58 compute-0 podman[226518]: 2026-01-23 09:33:58.211686377 +0000 UTC m=+0.049227630 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, architecture=x86_64)
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.033 182096 DEBUG nova.network.neutron [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.047 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.048 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance network_info: |[{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.048 182096 DEBUG oslo_concurrency.lockutils [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.048 182096 DEBUG nova.network.neutron [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Refreshing network info cache for port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.050 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start _get_guest_xml network_info=[{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.053 182096 WARNING nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.063 182096 DEBUG nova.virt.libvirt.host [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.063 182096 DEBUG nova.virt.libvirt.host [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.066 182096 DEBUG nova.virt.libvirt.host [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.066 182096 DEBUG nova.virt.libvirt.host [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.067 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.067 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.068 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.068 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.068 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.068 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.068 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.069 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.069 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.069 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.069 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.069 182096 DEBUG nova.virt.hardware [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.073 182096 DEBUG nova.virt.libvirt.vif [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-361105592',display_name='tempest-ServerStableDeviceRescueTest-server-361105592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-361105592',id=137,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da36a2e2bae4483caec82cba10014d48',ramdisk_id='',reservation_id='r-42fam2uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1355926257',owner_user_name='temp
est-ServerStableDeviceRescueTest-1355926257-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:54Z,user_data=None,user_id='800ea9ca92114ca5bf7589f4500f4bec',uuid=77396169-1daa-41c7-8ba6-c68b50815e2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.073 182096 DEBUG nova.network.os_vif_util [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converting VIF {"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.074 182096 DEBUG nova.network.os_vif_util [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.074 182096 DEBUG nova.objects.instance [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'pci_devices' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.083 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <uuid>77396169-1daa-41c7-8ba6-c68b50815e2a</uuid>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <name>instance-00000089</name>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-361105592</nova:name>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:33:59</nova:creationTime>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:user uuid="800ea9ca92114ca5bf7589f4500f4bec">tempest-ServerStableDeviceRescueTest-1355926257-project-member</nova:user>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:project uuid="da36a2e2bae4483caec82cba10014d48">tempest-ServerStableDeviceRescueTest-1355926257</nova:project>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         <nova:port uuid="eaf8bed9-e3ff-491c-adfa-61a91e2b0509">
Jan 23 09:33:59 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <system>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="serial">77396169-1daa-41c7-8ba6-c68b50815e2a</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="uuid">77396169-1daa-41c7-8ba6-c68b50815e2a</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </system>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <os>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </os>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <features>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </features>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:e0:a6:68"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <target dev="tapeaf8bed9-e3"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/console.log" append="off"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <video>
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </video>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:33:59 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:33:59 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:33:59 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:33:59 compute-0 nova_compute[182092]: </domain>
Jan 23 09:33:59 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.083 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Preparing to wait for external event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.084 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.084 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.084 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.085 182096 DEBUG nova.virt.libvirt.vif [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-361105592',display_name='tempest-ServerStableDeviceRescueTest-server-361105592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-361105592',id=137,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da36a2e2bae4483caec82cba10014d48',ramdisk_id='',reservation_id='r-42fam2uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1355926257',owner_user_
name='tempest-ServerStableDeviceRescueTest-1355926257-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:33:54Z,user_data=None,user_id='800ea9ca92114ca5bf7589f4500f4bec',uuid=77396169-1daa-41c7-8ba6-c68b50815e2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.085 182096 DEBUG nova.network.os_vif_util [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converting VIF {"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.085 182096 DEBUG nova.network.os_vif_util [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.085 182096 DEBUG os_vif [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.086 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.086 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.086 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.089 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.089 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeaf8bed9-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.089 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeaf8bed9-e3, col_values=(('external_ids', {'iface-id': 'eaf8bed9-e3ff-491c-adfa-61a91e2b0509', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:a6:68', 'vm-uuid': '77396169-1daa-41c7-8ba6-c68b50815e2a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.090 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.0921] manager: (tapeaf8bed9-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.092 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.094 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.095 182096 INFO os_vif [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3')
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.132 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.132 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.133 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No VIF found with MAC fa:16:3e:e0:a6:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.133 182096 INFO nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Using config drive
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.255 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.383 182096 INFO nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Creating config drive at /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.390 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8k5jfwi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.418 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.511 182096 DEBUG oslo_concurrency.processutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8k5jfwi" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:33:59 compute-0 kernel: tapeaf8bed9-e3: entered promiscuous mode
Jan 23 09:33:59 compute-0 ovn_controller[94697]: 2026-01-23T09:33:59Z|00520|binding|INFO|Claiming lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for this chassis.
Jan 23 09:33:59 compute-0 ovn_controller[94697]: 2026-01-23T09:33:59Z|00521|binding|INFO|eaf8bed9-e3ff-491c-adfa-61a91e2b0509: Claiming fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.554 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.5562] manager: (tapeaf8bed9-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.556 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.561 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '2', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.562 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 bound to our chassis
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.563 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.572 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[769f7714-7f30-4387-beb5-744683fd56bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.572 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a0d6dfa-01 in ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.576 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a0d6dfa-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.576 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7051ea79-b817-41f6-ad59-c5e9f9f27fc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.577 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b31c5c5-921d-4221-b0a5-349012b39b52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.588 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f845f4a1-517e-4417-84b0-c0837bc5da5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 systemd-udevd[226558]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:33:59 compute-0 systemd-machined[153562]: New machine qemu-69-instance-00000089.
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.6051] device (tapeaf8bed9-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.6058] device (tapeaf8bed9-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.610 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_controller[94697]: 2026-01-23T09:33:59Z|00522|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 ovn-installed in OVS
Jan 23 09:33:59 compute-0 ovn_controller[94697]: 2026-01-23T09:33:59Z|00523|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 up in Southbound
Jan 23 09:33:59 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-00000089.
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.620 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c734d5c-f7fa-4d0e-bf47-eba2e3148bfb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.642 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[fae991f4-a779-4c93-9a58-532dcc087236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.6475] manager: (tap1a0d6dfa-00): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.646 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f709bae9-6585-498c-96f2-a263e1630b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.671 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[66ecd62c-ac1b-4839-bb63-44f4b0e0a057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.673 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5974e2-3e86-4452-ac18-83a20c1e7a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.6914] device (tap1a0d6dfa-00): carrier: link connected
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.695 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[50ecc12f-451f-4743-8710-fc5cf17b8e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.707 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b9627b6f-35dd-4289-9aed-db2b21d2fed1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434517, 'reachable_time': 36756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226582, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.718 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[11515da3-07a0-4581-a310-160b972b8016]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:b644'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434517, 'tstamp': 434517}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226583, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.730 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[56542284-05f9-432b-a817-51ec0efda1a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434517, 'reachable_time': 36756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226584, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[589e83f3-179b-41ee-bc23-72bd55389d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.790 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b1087d-9e80-4124-b9bb-b16082b4a1b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.791 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.791 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.791 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a0d6dfa-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 kernel: tap1a0d6dfa-00: entered promiscuous mode
Jan 23 09:33:59 compute-0 NetworkManager[54920]: <info>  [1769160839.7938] manager: (tap1a0d6dfa-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.800 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a0d6dfa-00, col_values=(('external_ids', {'iface-id': 'ce525290-d2d6-41ee-a73a-700c8cf045a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:33:59 compute-0 ovn_controller[94697]: 2026-01-23T09:33:59Z|00524|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.801 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.814 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:33:59 compute-0 nova_compute[182092]: 2026-01-23 09:33:59.813 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.814 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[702787c1-2e24-43c8-93a4-806801b88549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.815 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:33:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:33:59.816 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'env', 'PROCESS_TAG=haproxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:34:00 compute-0 podman[226613]: 2026-01-23 09:34:00.086878928 +0000 UTC m=+0.030100946 container create 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:34:00 compute-0 systemd[1]: Started libpod-conmon-16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600.scope.
Jan 23 09:34:00 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:34:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9a2c171ab6d2f6d71aa09ed2d4255dcf1abb613cc67c993581ea68ebe07a2e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:34:00 compute-0 podman[226613]: 2026-01-23 09:34:00.142551865 +0000 UTC m=+0.085773903 container init 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:34:00 compute-0 podman[226613]: 2026-01-23 09:34:00.146974516 +0000 UTC m=+0.090196534 container start 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:34:00 compute-0 podman[226613]: 2026-01-23 09:34:00.072815983 +0000 UTC m=+0.016038021 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:34:00 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [NOTICE]   (226629) : New worker (226631) forked
Jan 23 09:34:00 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [NOTICE]   (226629) : Loading success.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.296 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160840.2964303, 77396169-1daa-41c7-8ba6-c68b50815e2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.297 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Started (Lifecycle Event)
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.320 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.325 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160840.2966049, 77396169-1daa-41c7-8ba6-c68b50815e2a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.325 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Paused (Lifecycle Event)
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.344 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.346 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.361 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.517 182096 DEBUG nova.compute.manager [req-01cc4465-9773-4e13-8a5b-ee7bfa554d1b req-27071bfc-1743-49b8-91a2-0c423a0b9de3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.517 182096 DEBUG oslo_concurrency.lockutils [req-01cc4465-9773-4e13-8a5b-ee7bfa554d1b req-27071bfc-1743-49b8-91a2-0c423a0b9de3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.517 182096 DEBUG oslo_concurrency.lockutils [req-01cc4465-9773-4e13-8a5b-ee7bfa554d1b req-27071bfc-1743-49b8-91a2-0c423a0b9de3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.517 182096 DEBUG oslo_concurrency.lockutils [req-01cc4465-9773-4e13-8a5b-ee7bfa554d1b req-27071bfc-1743-49b8-91a2-0c423a0b9de3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.518 182096 DEBUG nova.compute.manager [req-01cc4465-9773-4e13-8a5b-ee7bfa554d1b req-27071bfc-1743-49b8-91a2-0c423a0b9de3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Processing event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.518 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.520 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160840.5203195, 77396169-1daa-41c7-8ba6-c68b50815e2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.520 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Resumed (Lifecycle Event)
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.521 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.523 182096 INFO nova.virt.libvirt.driver [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance spawned successfully.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.523 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.542 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.546 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.549 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.549 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.549 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.549 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.550 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.550 182096 DEBUG nova.virt.libvirt.driver [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.579 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.624 182096 INFO nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Took 6.41 seconds to spawn the instance on the hypervisor.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.624 182096 DEBUG nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.696 182096 INFO nova.compute.manager [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Took 6.75 seconds to build instance.
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.712 182096 DEBUG oslo_concurrency.lockutils [None req-042205f5-4fe0-4fe6-8c62-b4a0888b449e 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.902 182096 DEBUG nova.network.neutron [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updated VIF entry in instance network info cache for port eaf8bed9-e3ff-491c-adfa-61a91e2b0509. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.902 182096 DEBUG nova.network.neutron [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:00 compute-0 nova_compute[182092]: 2026-01-23 09:34:00.913 182096 DEBUG oslo_concurrency.lockutils [req-f439ced9-b6fe-471f-b35a-889a3735fe1b req-fdc9d107-bb73-4507-8f5e-589c4272747d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.391 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.843 182096 DEBUG nova.compute.manager [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.844 182096 DEBUG oslo_concurrency.lockutils [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.844 182096 DEBUG oslo_concurrency.lockutils [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.844 182096 DEBUG oslo_concurrency.lockutils [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.845 182096 DEBUG nova.compute.manager [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:02 compute-0 nova_compute[182092]: 2026-01-23 09:34:02.845 182096 WARNING nova.compute.manager [req-44c61946-9f86-41ce-8f47-4c468789dead req-f74983cb-3c65-4fc0-b274-710964ef0cf8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state None.
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.264 182096 DEBUG nova.compute.manager [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.301 182096 INFO nova.compute.manager [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] instance snapshotting
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.515 182096 INFO nova.virt.libvirt.driver [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Beginning live snapshot process
Jan 23 09:34:03 compute-0 virtqemud[181713]: invalid argument: disk vda does not have an active block job
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.632 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.694 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.695 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.752 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.762 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.817 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.819 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.840 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376.delta 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.841 182096 INFO nova.virt.libvirt.driver [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 23 09:34:03 compute-0 nova_compute[182092]: 2026-01-23 09:34:03.873 182096 DEBUG nova.virt.libvirt.guest [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.092 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.376 182096 DEBUG nova.virt.libvirt.guest [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.379 182096 INFO nova.virt.libvirt.driver [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.407 182096 DEBUG nova.privsep.utils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.407 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376.delta /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.461 182096 DEBUG oslo_concurrency.processutils [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376.delta /var/lib/nova/instances/snapshots/tmpfg7fdehw/946151b3add545ddb085d906ddc3c376" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:04 compute-0 nova_compute[182092]: 2026-01-23 09:34:04.462 182096 INFO nova.virt.libvirt.driver [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Snapshot extracted, beginning image upload
Jan 23 09:34:06 compute-0 nova_compute[182092]: 2026-01-23 09:34:06.240 182096 INFO nova.virt.libvirt.driver [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Snapshot image upload complete
Jan 23 09:34:06 compute-0 nova_compute[182092]: 2026-01-23 09:34:06.241 182096 INFO nova.compute.manager [None req-f8cb6fac-d477-4498-9841-1682fba9e144 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Took 2.93 seconds to snapshot the instance on the hypervisor.
Jan 23 09:34:06 compute-0 nova_compute[182092]: 2026-01-23 09:34:06.659 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:07 compute-0 podman[226670]: 2026-01-23 09:34:07.219467147 +0000 UTC m=+0.058661605 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.393 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.671 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.710 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.767 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.767 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.826 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.848 182096 INFO nova.compute.manager [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Rescuing
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.849 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.849 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.849 182096 DEBUG nova.network.neutron [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.895 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160832.8941612, 17cffa6e-a77c-4bc8-965a-82b05ed586b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.895 182096 INFO nova.compute.manager [-] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] VM Stopped (Lifecycle Event)
Jan 23 09:34:07 compute-0 nova_compute[182092]: 2026-01-23 09:34:07.913 182096 DEBUG nova.compute.manager [None req-530ca773-e1d6-4539-bdcb-698c63da3442 - - - - - -] [instance: 17cffa6e-a77c-4bc8-965a-82b05ed586b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.086 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.086 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5522MB free_disk=73.26066589355469GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.087 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.087 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.152 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 77396169-1daa-41c7-8ba6-c68b50815e2a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.152 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.152 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.189 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.204 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.219 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.219 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.910 182096 DEBUG nova.network.neutron [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:08 compute-0 nova_compute[182092]: 2026-01-23 09:34:08.930 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.095 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.136 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.219 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.220 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.671 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:34:09 compute-0 nova_compute[182092]: 2026-01-23 09:34:09.671 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:11 compute-0 nova_compute[182092]: 2026-01-23 09:34:11.276 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:11 compute-0 nova_compute[182092]: 2026-01-23 09:34:11.289 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:11 compute-0 nova_compute[182092]: 2026-01-23 09:34:11.290 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:34:11 compute-0 nova_compute[182092]: 2026-01-23 09:34:11.290 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:12 compute-0 nova_compute[182092]: 2026-01-23 09:34:12.394 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:12 compute-0 ovn_controller[94697]: 2026-01-23T09:34:12Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:34:12 compute-0 ovn_controller[94697]: 2026-01-23T09:34:12Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:34:13 compute-0 nova_compute[182092]: 2026-01-23 09:34:13.285 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:13 compute-0 nova_compute[182092]: 2026-01-23 09:34:13.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:13 compute-0 nova_compute[182092]: 2026-01-23 09:34:13.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:34:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:13.889 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:13 compute-0 nova_compute[182092]: 2026-01-23 09:34:13.890 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:13.891 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:34:14 compute-0 nova_compute[182092]: 2026-01-23 09:34:14.097 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:14 compute-0 podman[226714]: 2026-01-23 09:34:14.204348722 +0000 UTC m=+0.042222900 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:34:14 compute-0 podman[226715]: 2026-01-23 09:34:14.230198157 +0000 UTC m=+0.065841107 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:34:15 compute-0 nova_compute[182092]: 2026-01-23 09:34:15.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:15 compute-0 nova_compute[182092]: 2026-01-23 09:34:15.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:34:17 compute-0 nova_compute[182092]: 2026-01-23 09:34:17.396 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:19 compute-0 nova_compute[182092]: 2026-01-23 09:34:19.100 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:19 compute-0 nova_compute[182092]: 2026-01-23 09:34:19.166 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:34:21 compute-0 kernel: tapeaf8bed9-e3 (unregistering): left promiscuous mode
Jan 23 09:34:21 compute-0 NetworkManager[54920]: <info>  [1769160861.2940] device (tapeaf8bed9-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.298 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 ovn_controller[94697]: 2026-01-23T09:34:21Z|00525|binding|INFO|Releasing lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 from this chassis (sb_readonly=0)
Jan 23 09:34:21 compute-0 ovn_controller[94697]: 2026-01-23T09:34:21Z|00526|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 down in Southbound
Jan 23 09:34:21 compute-0 ovn_controller[94697]: 2026-01-23T09:34:21Z|00527|binding|INFO|Removing iface tapeaf8bed9-e3 ovn-installed in OVS
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.310 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '4', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.311 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 unbound from our chassis
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.312 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.314 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.314 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[60c95851-bd2a-4dde-bd30-b9f0dc3c0c01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.315 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace which is not needed anymore
Jan 23 09:34:21 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000089.scope: Deactivated successfully.
Jan 23 09:34:21 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000089.scope: Consumed 11.384s CPU time.
Jan 23 09:34:21 compute-0 systemd-machined[153562]: Machine qemu-69-instance-00000089 terminated.
Jan 23 09:34:21 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [NOTICE]   (226629) : haproxy version is 2.8.14-c23fe91
Jan 23 09:34:21 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [NOTICE]   (226629) : path to executable is /usr/sbin/haproxy
Jan 23 09:34:21 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [WARNING]  (226629) : Exiting Master process...
Jan 23 09:34:21 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [ALERT]    (226629) : Current worker (226631) exited with code 143 (Terminated)
Jan 23 09:34:21 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226625]: [WARNING]  (226629) : All workers exited. Exiting... (0)
Jan 23 09:34:21 compute-0 systemd[1]: libpod-16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600.scope: Deactivated successfully.
Jan 23 09:34:21 compute-0 podman[226772]: 2026-01-23 09:34:21.407440879 +0000 UTC m=+0.034164991 container died 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:34:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600-userdata-shm.mount: Deactivated successfully.
Jan 23 09:34:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd9a2c171ab6d2f6d71aa09ed2d4255dcf1abb613cc67c993581ea68ebe07a2e-merged.mount: Deactivated successfully.
Jan 23 09:34:21 compute-0 podman[226772]: 2026-01-23 09:34:21.428591423 +0000 UTC m=+0.055315535 container cleanup 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:34:21 compute-0 systemd[1]: libpod-conmon-16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600.scope: Deactivated successfully.
Jan 23 09:34:21 compute-0 podman[226796]: 2026-01-23 09:34:21.466456351 +0000 UTC m=+0.024137628 container remove 16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.469 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[177a7253-a272-4dfc-9ec3-2cbb6495bf8c]: (4, ('Fri Jan 23 09:34:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600)\n16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600\nFri Jan 23 09:34:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600)\n16b3860673120ae760ad7bf71766785a19266ba582e8af2f89c5a3bc59b90600\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.470 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e74e1107-8981-4bee-a9c3-b4f2e7a85452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.471 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.473 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 kernel: tap1a0d6dfa-00: left promiscuous mode
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.487 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.488 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e33625-b88f-4142-9477-eff62fa6422e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.502 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[46930b7d-2b2e-492e-ba04-601fad28d390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.503 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[53609d22-4562-4b2f-a49d-e9d51bc6af3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.515 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8767536e-9066-4544-9918-a857c277f541]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434512, 'reachable_time': 24593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226810, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d1a0d6dfa\x2d0d95\x2d490d\x2dab54\x2d4e7f98a34e91.mount: Deactivated successfully.
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.517 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:34:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:21.517 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b67a3383-f6ae-4c4b-bb6e-1012dc37a198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.530 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.552 182096 DEBUG nova.compute.manager [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.553 182096 DEBUG oslo_concurrency.lockutils [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.553 182096 DEBUG oslo_concurrency.lockutils [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.553 182096 DEBUG oslo_concurrency.lockutils [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.553 182096 DEBUG nova.compute.manager [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:21 compute-0 nova_compute[182092]: 2026-01-23 09:34:21.553 182096 WARNING nova.compute.manager [req-f6d25d5e-5187-4932-8cef-851b72f6103f req-c073f7cb-6184-46bd-8f2f-dae28081fc24 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state rescuing.
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.175 182096 INFO nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance shutdown successfully after 13 seconds.
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.179 182096 INFO nova.virt.libvirt.driver [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance destroyed successfully.
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.179 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'numa_topology' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.191 182096 INFO nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Attempting a stable device rescue
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.396 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.451 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.454 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.454 182096 INFO nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Creating image(s)
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.455 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.455 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.456 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.456 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.463 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:22 compute-0 nova_compute[182092]: 2026-01-23 09:34:22.464 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:22.893 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.634 182096 DEBUG nova.compute.manager [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.635 182096 DEBUG oslo_concurrency.lockutils [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.635 182096 DEBUG oslo_concurrency.lockutils [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.635 182096 DEBUG oslo_concurrency.lockutils [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.635 182096 DEBUG nova.compute.manager [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.635 182096 WARNING nova.compute.manager [req-d2889f5c-9a1b-4d50-bff5-490107e74bbb req-27b92e64-e8f4-414b-bc7d-888b3ca0b6a6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state rescuing.
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.771 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.827 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.part --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.828 182096 DEBUG nova.virt.images [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] 0328b848-8653-45f6-be05-4940f580aaac was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.829 182096 DEBUG nova.privsep.utils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.829 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.part /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.881 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.part /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.converted" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.885 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.937 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb.converted --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.938 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.949 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.949 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:23 compute-0 nova_compute[182092]: 2026-01-23 09:34:23.958 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.005 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.005 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb,backing_fmt=raw /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.027 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb,backing_fmt=raw /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.rescue" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.027 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "9813441954c0e6188ad4957fcfa70eab6148e9cb" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.028 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'migration_context' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.039 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.041 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start _get_guest_xml network_info=[{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "vif_mac": "fa:16:3e:e0:a6:68"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '0328b848-8653-45f6-be05-4940f580aaac', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.041 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'resources' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.050 182096 WARNING nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.058 182096 DEBUG nova.virt.libvirt.host [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.058 182096 DEBUG nova.virt.libvirt.host [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.061 182096 DEBUG nova.virt.libvirt.host [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.061 182096 DEBUG nova.virt.libvirt.host [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.062 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.063 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.063 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.063 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.063 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.064 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.064 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.064 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.064 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.064 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.065 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.065 182096 DEBUG nova.virt.hardware [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.065 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.079 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.102 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.127 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.128 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.128 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.128 182096 DEBUG oslo_concurrency.lockutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.130 182096 DEBUG nova.virt.libvirt.vif [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-361105592',display_name='tempest-ServerStableDeviceRescueTest-server-361105592',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-361105592',id=137,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:34:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da36a2e2bae4483caec82cba10014d48',ramdisk_id='',reservation_id='r-42fam2uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1355926257',owner_user_name='tempest-ServerStableDeviceRescueTest-1355926257-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:34:06Z,user_data=None,user_id='800ea9ca92114ca5bf7589f4500f4bec',uuid=77396169-1daa-41c7-8ba6-c68b50815e2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "vif_mac": "fa:16:3e:e0:a6:68"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.130 182096 DEBUG nova.network.os_vif_util [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converting VIF {"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "vif_mac": "fa:16:3e:e0:a6:68"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.131 182096 DEBUG nova.network.os_vif_util [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.131 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'pci_devices' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.150 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <uuid>77396169-1daa-41c7-8ba6-c68b50815e2a</uuid>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <name>instance-00000089</name>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-361105592</nova:name>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:34:24</nova:creationTime>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:user uuid="800ea9ca92114ca5bf7589f4500f4bec">tempest-ServerStableDeviceRescueTest-1355926257-project-member</nova:user>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:project uuid="da36a2e2bae4483caec82cba10014d48">tempest-ServerStableDeviceRescueTest-1355926257</nova:project>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         <nova:port uuid="eaf8bed9-e3ff-491c-adfa-61a91e2b0509">
Jan 23 09:34:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <system>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="serial">77396169-1daa-41c7-8ba6-c68b50815e2a</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="uuid">77396169-1daa-41c7-8ba6-c68b50815e2a</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </system>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <os>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </os>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <features>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </features>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.rescue"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <target dev="sdb" bus="usb"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <boot order="1"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:e0:a6:68"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <target dev="tapeaf8bed9-e3"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/console.log" append="off"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <video>
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </video>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:34:24 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:34:24 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:34:24 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:34:24 compute-0 nova_compute[182092]: </domain>
Jan 23 09:34:24 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.156 182096 INFO nova.virt.libvirt.driver [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance destroyed successfully.
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.197 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.197 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.197 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.198 182096 DEBUG nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] No VIF found with MAC fa:16:3e:e0:a6:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.198 182096 INFO nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Using config drive
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.208 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:24 compute-0 nova_compute[182092]: 2026-01-23 09:34:24.244 182096 DEBUG nova.objects.instance [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'keypairs' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:26 compute-0 podman[226851]: 2026-01-23 09:34:26.206278321 +0000 UTC m=+0.043377146 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:34:26 compute-0 podman[226852]: 2026-01-23 09:34:26.20628286 +0000 UTC m=+0.042083927 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.355 182096 INFO nova.virt.libvirt.driver [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Creating config drive at /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config.rescue
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.359 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmc413v5d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.478 182096 DEBUG oslo_concurrency.processutils [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmc413v5d" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:26 compute-0 kernel: tapeaf8bed9-e3: entered promiscuous mode
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 ovn_controller[94697]: 2026-01-23T09:34:26Z|00528|binding|INFO|Claiming lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for this chassis.
Jan 23 09:34:26 compute-0 ovn_controller[94697]: 2026-01-23T09:34:26Z|00529|binding|INFO|eaf8bed9-e3ff-491c-adfa-61a91e2b0509: Claiming fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.5301] manager: (tapeaf8bed9-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.533 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '5', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.534 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 bound to our chassis
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.536 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:26 compute-0 ovn_controller[94697]: 2026-01-23T09:34:26Z|00530|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 ovn-installed in OVS
Jan 23 09:34:26 compute-0 ovn_controller[94697]: 2026-01-23T09:34:26Z|00531|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 up in Southbound
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.541 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.542 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.546 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e25ab73d-455a-4f83-9e85-62e1b6d73d29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.546 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a0d6dfa-01 in ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.548 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a0d6dfa-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.548 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a94b60ed-1f70-4ede-972e-b70fe57465fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.549 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bd277b87-c7e6-45c2-81f6-43ba8e9d0aad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 systemd-udevd[226912]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.559 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd55e77-d73e-4a04-bdc1-7f455f993f6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 systemd-machined[153562]: New machine qemu-70-instance-00000089.
Jan 23 09:34:26 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-00000089.
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.5693] device (tapeaf8bed9-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.5699] device (tapeaf8bed9-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.580 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[336c6f84-e7b3-411e-8469-b36860e8ba61]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.603 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[93fa4d07-54fe-4dea-a22d-eeacab3eee1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.605 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa4e9ad1-dcab-4990-b878-8c556ba26507]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.6063] manager: (tap1a0d6dfa-00): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 23 09:34:26 compute-0 systemd-udevd[226915]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.628 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[7f436f01-36e4-45e8-b980-9e3bf1a1aacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.630 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[75b15ef9-df3c-41a1-9187-63f06a117566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.6461] device (tap1a0d6dfa-00): carrier: link connected
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.650 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d66d5138-4bb0-4481-ab27-d454894db81b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.662 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[85c16b46-b1ec-4f58-b3b5-53402de13c19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437213, 'reachable_time': 30810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226935, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.673 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8db8c2c5-66db-405f-8ab0-3b777aac4213]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:b644'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 437213, 'tstamp': 437213}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226936, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.686 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[09194c20-d66c-482f-952b-86ceb025f0ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437213, 'reachable_time': 30810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226937, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.707 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c0230f0c-3886-426c-8bc5-d956d1cd8156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.754 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4e0296-a7b6-4c1a-8be3-0b8f61b2c4cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.755 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.755 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.757 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a0d6dfa-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:26 compute-0 NetworkManager[54920]: <info>  [1769160866.7593] manager: (tap1a0d6dfa-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 23 09:34:26 compute-0 kernel: tap1a0d6dfa-00: entered promiscuous mode
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.760 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a0d6dfa-00, col_values=(('external_ids', {'iface-id': 'ce525290-d2d6-41ee-a73a-700c8cf045a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 ovn_controller[94697]: 2026-01-23T09:34:26Z|00532|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.766 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.766 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[230764c2-8df7-4ed5-9f4f-38a835cee6a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.767 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:34:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:26.769 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'env', 'PROCESS_TAG=haproxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.774 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.779 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 77396169-1daa-41c7-8ba6-c68b50815e2a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.779 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160866.779029, 77396169-1daa-41c7-8ba6-c68b50815e2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.779 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Resumed (Lifecycle Event)
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.790 182096 DEBUG nova.compute.manager [None req-690413c2-f9e8-4051-bff4-26d08912ba76 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.797 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.799 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.832 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.832 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160866.7812424, 77396169-1daa-41c7-8ba6-c68b50815e2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.832 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Started (Lifecycle Event)
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.855 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:26 compute-0 nova_compute[182092]: 2026-01-23 09:34:26.857 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:27 compute-0 podman[226972]: 2026-01-23 09:34:27.049166519 +0000 UTC m=+0.030493316 container create 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:34:27 compute-0 systemd[1]: Started libpod-conmon-74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6.scope.
Jan 23 09:34:27 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:34:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8e9466b35a12ed8946dfcbcb8b40c6c60689b8c898cd0458fd83a71a5773be8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:34:27 compute-0 podman[226972]: 2026-01-23 09:34:27.10552544 +0000 UTC m=+0.086852257 container init 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 09:34:27 compute-0 podman[226972]: 2026-01-23 09:34:27.109633639 +0000 UTC m=+0.090960436 container start 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 09:34:27 compute-0 podman[226972]: 2026-01-23 09:34:27.036761672 +0000 UTC m=+0.018088489 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:34:27 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [NOTICE]   (226989) : New worker (226991) forked
Jan 23 09:34:27 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [NOTICE]   (226989) : Loading success.
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.398 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.791 182096 DEBUG nova.compute.manager [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.791 182096 DEBUG oslo_concurrency.lockutils [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.791 182096 DEBUG oslo_concurrency.lockutils [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.792 182096 DEBUG oslo_concurrency.lockutils [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.792 182096 DEBUG nova.compute.manager [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:27 compute-0 nova_compute[182092]: 2026-01-23 09:34:27.792 182096 WARNING nova.compute.manager [req-296e3b0c-23d5-4d74-91a1-9522052b4789 req-afa019d3-fe44-499b-b90c-eaf2d163808c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state rescued and task_state None.
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.106 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:29 compute-0 podman[226996]: 2026-01-23 09:34:29.223688507 +0000 UTC m=+0.052924655 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.626 182096 INFO nova.compute.manager [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Unrescuing
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.626 182096 DEBUG oslo_concurrency.lockutils [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.627 182096 DEBUG oslo_concurrency.lockutils [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.627 182096 DEBUG nova.network.neutron [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.851 182096 DEBUG nova.compute.manager [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.851 182096 DEBUG oslo_concurrency.lockutils [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.852 182096 DEBUG oslo_concurrency.lockutils [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.852 182096 DEBUG oslo_concurrency.lockutils [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.852 182096 DEBUG nova.compute.manager [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:29 compute-0 nova_compute[182092]: 2026-01-23 09:34:29.852 182096 WARNING nova.compute.manager [req-1e680416-07a9-4561-b678-4a3b4f1558c6 req-3ef7eca2-61c2-4711-b92f-f04928ecbcba 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state rescued and task_state unrescuing.
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.387 182096 DEBUG nova.network.neutron [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.400 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.402 182096 DEBUG oslo_concurrency.lockutils [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.403 182096 DEBUG nova.objects.instance [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'flavor' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:32 compute-0 kernel: tapeaf8bed9-e3 (unregistering): left promiscuous mode
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.4415] device (tapeaf8bed9-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00533|binding|INFO|Releasing lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 from this chassis (sb_readonly=0)
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00534|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 down in Southbound
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00535|binding|INFO|Removing iface tapeaf8bed9-e3 ovn-installed in OVS
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.450 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.460 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '6', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.462 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 unbound from our chassis
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.464 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.465 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25beaae1-c2b8-49cd-b1fa-ba45c2867723]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.465 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace which is not needed anymore
Jan 23 09:34:32 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000089.scope: Deactivated successfully.
Jan 23 09:34:32 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000089.scope: Consumed 5.837s CPU time.
Jan 23 09:34:32 compute-0 systemd-machined[153562]: Machine qemu-70-instance-00000089 terminated.
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [NOTICE]   (226989) : haproxy version is 2.8.14-c23fe91
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [NOTICE]   (226989) : path to executable is /usr/sbin/haproxy
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [WARNING]  (226989) : Exiting Master process...
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [WARNING]  (226989) : Exiting Master process...
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [ALERT]    (226989) : Current worker (226991) exited with code 143 (Terminated)
Jan 23 09:34:32 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[226985]: [WARNING]  (226989) : All workers exited. Exiting... (0)
Jan 23 09:34:32 compute-0 systemd[1]: libpod-74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6.scope: Deactivated successfully.
Jan 23 09:34:32 compute-0 conmon[226985]: conmon 74a53e1a7b8e853bd6b9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6.scope/container/memory.events
Jan 23 09:34:32 compute-0 podman[227034]: 2026-01-23 09:34:32.556639358 +0000 UTC m=+0.033054235 container died 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:34:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6-userdata-shm.mount: Deactivated successfully.
Jan 23 09:34:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-d8e9466b35a12ed8946dfcbcb8b40c6c60689b8c898cd0458fd83a71a5773be8-merged.mount: Deactivated successfully.
Jan 23 09:34:32 compute-0 podman[227034]: 2026-01-23 09:34:32.574189962 +0000 UTC m=+0.050604838 container cleanup 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:34:32 compute-0 systemd[1]: libpod-conmon-74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6.scope: Deactivated successfully.
Jan 23 09:34:32 compute-0 podman[227058]: 2026-01-23 09:34:32.618832674 +0000 UTC m=+0.025813588 container remove 74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a22ebfe1-9b23-4dce-88a1-54f43e2130d4]: (4, ('Fri Jan 23 09:34:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6)\n74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6\nFri Jan 23 09:34:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6)\n74a53e1a7b8e853bd6b928b0f8b81fbb099a552229d25879c46a035f8b72eac6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.623 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2efe020a-c269-4047-b7a5-0b52d9c1cebb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.624 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:32 compute-0 kernel: tap1a0d6dfa-00: left promiscuous mode
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.626 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.6419] manager: (tapeaf8bed9-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.642 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.645 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[de570ee7-014a-46b1-b0b4-843d022c62db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.660 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0e00af-57ea-4009-917c-fa72607862fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.661 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d65ec8-8956-43cc-8a46-bd795dcdf9f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.674 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a60a2006-5de4-4207-a46f-79238e8fe782]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437208, 'reachable_time': 31656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227083, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d1a0d6dfa\x2d0d95\x2d490d\x2dab54\x2d4e7f98a34e91.mount: Deactivated successfully.
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.676 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.676 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5fa818-8d69-4b37-a898-46482edddbd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.685 182096 INFO nova.virt.libvirt.driver [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance destroyed successfully.
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.685 182096 DEBUG nova.objects.instance [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'numa_topology' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:32 compute-0 kernel: tapeaf8bed9-e3: entered promiscuous mode
Jan 23 09:34:32 compute-0 systemd-udevd[227017]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.7447] manager: (tapeaf8bed9-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.747 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00536|binding|INFO|Claiming lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for this chassis.
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00537|binding|INFO|eaf8bed9-e3ff-491c-adfa-61a91e2b0509: Claiming fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.7536] device (tapeaf8bed9-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.7540] device (tapeaf8bed9-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.759 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '6', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00538|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 up in Southbound
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00539|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 ovn-installed in OVS
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.760 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 bound to our chassis
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.761 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.762 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.764 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.771 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ef390434-683f-4519-9cdb-e1a20726b6db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.771 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a0d6dfa-01 in ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.772 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a0d6dfa-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.772 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7d89055f-1e4b-443a-8bdc-5a29dd6684fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.773 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4a811205-630a-4224-b863-6ca588c0a679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.782 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[897c3b67-2814-401e-abb1-5af6b65b5c0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 systemd-machined[153562]: New machine qemu-71-instance-00000089.
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.791 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c339d74-c945-4b8a-9e24-3a0de458ceda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-00000089.
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.811 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d52fc9-17bc-4f96-b1e8-37503a0b5bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.815 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[22f6aa04-5308-4688-b8fd-2c7203192afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.8175] manager: (tap1a0d6dfa-00): new Veth device (/org/freedesktop/NetworkManager/Devices/272)
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.837 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[86f2c9b9-df56-49a4-b45d-13515d0ec2de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.839 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0be94055-6d9c-4c8e-be22-6c9f73f439ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.8586] device (tap1a0d6dfa-00): carrier: link connected
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.864 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa4a43f-7c29-4155-9115-6fbb87c73d0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.875 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[601b35fe-5913-4cf7-81a4-849f9048e4cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437834, 'reachable_time': 25962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227129, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.887 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[53922168-3a13-4cbb-9051-d13f01c1a38c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:b644'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 437834, 'tstamp': 437834}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227130, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.901 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8a2b19-e6f0-42d3-b924-826588898c15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a0d6dfa-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:b6:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437834, 'reachable_time': 25962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227131, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.921 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c76a0d22-37f6-45bc-b468-5529bf7c7727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.960 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ec715f2c-cf93-4858-918a-6e61b45bf34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.960 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.961 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.961 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a0d6dfa-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:32 compute-0 NetworkManager[54920]: <info>  [1769160872.9631] manager: (tap1a0d6dfa-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 23 09:34:32 compute-0 kernel: tap1a0d6dfa-00: entered promiscuous mode
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.964 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a0d6dfa-00, col_values=(('external_ids', {'iface-id': 'ce525290-d2d6-41ee-a73a-700c8cf045a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:32 compute-0 ovn_controller[94697]: 2026-01-23T09:34:32Z|00540|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.962 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.979 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:34:32 compute-0 nova_compute[182092]: 2026-01-23 09:34:32.979 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.980 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d134fc2-8ecf-49b5-8a2f-3288fc8216d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.980 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.pid.haproxy
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:34:32 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:32.982 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'env', 'PROCESS_TAG=haproxy-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a0d6dfa-0d95-490d-ab54-4e7f98a34e91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.002 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000089', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'da36a2e2bae4483caec82cba10014d48', 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'hostId': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.005 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 77396169-1daa-41c7-8ba6-c68b50815e2a / tapeaf8bed9-e3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.005 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ba4223b-1966-4537-9523-ec67843c71ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.003627', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b95d7f18-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': '5581839f970996e1fc4b2d27b3899cd9ac75cbbe117533f9a2a253d372f7f9b6'}]}, 'timestamp': '2026-01-23 09:34:33.006013', '_unique_id': 'd2d05d25fbb343c7a24320d9ff0d5912'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.007 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17ab43b-6e4a-4d70-a25b-a38133fe95d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.007942', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b95dd74c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'c07b7f433edff46f7f7564de9546592098323f450224642c35fe4be6016b4b51'}]}, 'timestamp': '2026-01-23 09:34:33.008239', '_unique_id': '43429bfce02e413b968a69aa66c5dffa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.008 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.095 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 77396169-1daa-41c7-8ba6-c68b50815e2a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.096 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160873.0928278, 77396169-1daa-41c7-8ba6-c68b50815e2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.097 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Resumed (Lifecycle Event)
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.099 182096 DEBUG nova.compute.manager [None req-23cdf1ce-b59a-4bdb-94c8-307dd35a85db 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.122 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.124 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.137 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.137 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d98134-50c4-4ca8-bfc2-089275acc3b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.009854', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b971a0d8-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'f847ef40f56896cdb78a29438ed931aa84fec9fb24f0566511b6745dc513cfff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.009854', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b971ac9a-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': '78fbd8cc4154f758a06ecd207e082db90579f3e1868ec2478da6e0446d5a8238'}]}, 'timestamp': '2026-01-23 09:34:33.138211', '_unique_id': 'd3ab4a0f6b99403eaacfc570b084dbd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.138 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.139 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b3fe1f4-fadd-4301-bf1f-710b7e9fabd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.139918', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b971fa92-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'cf2fcd55d98a9647a378a4dc3db5ca7c36c07e1c2874cfef11ead3dab2854d39'}]}, 'timestamp': '2026-01-23 09:34:33.140209', '_unique_id': 'b4741ca386244efbab63c2207dab9c56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.140 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.141 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.incoming.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6239886e-006e-4718-aaee-2937f222b869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.141684', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b9723f84-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'c8e9d24e0663cdc0b6554fb33e53029a75675be11a963a1123207abd49a4ee0b'}]}, 'timestamp': '2026-01-23 09:34:33.141956', '_unique_id': 'f184687b40234649b2ecd442eca97d4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.142 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.143 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12e8f485-89be-4b09-9713-a00eeae1f193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.143365', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b97280de-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'd6aa03f40536901104e948352e25f535e27ba64730923680d0bc14ed1cff3143'}]}, 'timestamp': '2026-01-23 09:34:33.143628', '_unique_id': 'edb7f165833f43e788187dfcab1f5e8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.145 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.145 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bc4450e-3283-4f3b-a0d4-286ac05611b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.145031', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b972c260-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': '1729a36b74129012f705bb90a62f95784429a28763e19bfb13810799c4090d98'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.145031', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b972cb84-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'd0202611e0e49ca861f60d56f34ecd3efa50b0d8d52dbe3154bfd26ca3f384d4'}]}, 'timestamp': '2026-01-23 09:34:33.145527', '_unique_id': '59a41729701f42a7a2cb87c47ec78ccd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.146 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad8214b9-d593-4da6-b121-9a4e4121a7fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.146933', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b9730c20-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'f107d6656bf204d3a0346ccdf3b4aacad2bab40b9804c562c88de1e90d395d00'}]}, 'timestamp': '2026-01-23 09:34:33.147208', '_unique_id': 'aa4e2d23b2cd44178f8c2ac09ec02209'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.147 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.150 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] During sync_power_state the instance has a pending task (unrescuing). Skip.
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.150 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160873.0975616, 77396169-1daa-41c7-8ba6-c68b50815e2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.150 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Started (Lifecycle Event)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.165 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d5dc4e1-ab38-455a-a799-f152b8638d07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.153840', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b975e800-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'ea38d7bbad0248ec5023f8bc049a888d965565cadd100af6941ce8f5aa243457'}]}, 'timestamp': '2026-01-23 09:34:33.165932', '_unique_id': '7b1bfd9ce8694eb680abae570bf0e8ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.166 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd349913b-59e0-4cb6-8570-c5fe3294731c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.167208', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b97622f2-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'cbe77dce4be336c874e2edafe37ecd4c82af71b23a5ddcf1b9af774b66101674'}]}, 'timestamp': '2026-01-23 09:34:33.167438', '_unique_id': '78e7a4c3c87c4ab59154b6e3d0eb8520'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.168 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.174 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.176 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.187 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.187 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 77396169-1daa-41c7-8ba6-c68b50815e2a: ceilometer.compute.pollsters.NoVolumeException
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.187 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.187 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af1380aa-ee41-4da0-8fe2-b2a9d41a3f20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.187688', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9794342-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': '1f048e2c6051fda3f7965cc9fc4aa23ef66c0cec433f728c66d29e6e9df2784a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.187688', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9794ba8-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'aa94a0d9714bad5cc140f6b500ea1537ce4c056e25400c5630a7e4be148b6b3a'}]}, 'timestamp': '2026-01-23 09:34:33.188128', '_unique_id': '19479796e0ca4ae799292b862bb21a43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.188 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.189 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.189 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/cpu volume: 50000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '109e02b7-df72-488a-9be4-67b7dc0c75d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50000000, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'timestamp': '2026-01-23T09:34:33.189251', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b9797fce-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.716724444, 'message_signature': '6850402504d59fa861312b1241797af96025c5778eb763633b3738487a73ebc8'}]}, 'timestamp': '2026-01-23 09:34:33.189469', '_unique_id': 'bec735573c2346aaad054e1014becac6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.190 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.191 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e6181de-5303-4d8b-b53d-8370929e8a2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.191481', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b979d708-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': 'defd70f22ec377c75fffb5c41a1d6dd88459ece4587f75fe2d3f94fc9479d467'}]}, 'timestamp': '2026-01-23 09:34:33.191732', '_unique_id': 'f85bfe9e14fc402db8b7f0b143945924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.212 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.212 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1ee9295-4b2b-402b-9470-ec86a3ba2cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.192804', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97d0c52-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': '80f74378feb90145aadcaa178734ed18dc2af5d6431447d845b91aa4a256f7d0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.192804', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97d1ba2-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': '1ed8df69d0922db11703c7335fbbfb79dda4e8c2b3f807a1bd52851e5efa2810'}]}, 'timestamp': '2026-01-23 09:34:33.213136', '_unique_id': '2e1e1d3990464baaa56d89fc7ee7ddad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.213 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.216 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.216 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>]
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.217 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.217 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3203d69-8b8c-48a7-8d0d-d6af5ba29476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.217230', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97dc642-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': 'c0bcd3d424317326aefeb2d946f1dc23588e0707f758a150c4a062c8dabd942f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.217230', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97dcff2-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': '9466780784dff50a053a61b2768882dc52c472f02a46408da17ae1a60f0f5653'}]}, 'timestamp': '2026-01-23 09:34:33.217761', '_unique_id': 'd7ca68ba4e8c4e9c9843f81846d707dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.218 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>]
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>]
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.219 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/network.incoming.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2258a6a-a982-4dc2-84c1-e7c0331998bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': 'instance-00000089-77396169-1daa-41c7-8ba6-c68b50815e2a-tapeaf8bed9-e3', 'timestamp': '2026-01-23T09:34:33.219862', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'tapeaf8bed9-e3', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e0:a6:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeaf8bed9-e3'}, 'message_id': 'b97e2bdc-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.533158904, 'message_signature': '6f4c0367db3622a062488eee8c6f159e5506cb1e121f8b860f558b0936e98c5d'}]}, 'timestamp': '2026-01-23 09:34:33.220101', '_unique_id': '767f52b4d1134701bee48e2e0102e570'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.220 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.221 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.221 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '003d2c56-4a61-4abc-a839-5753d63f8dc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.221475', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97e6a8e-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': '17cdf61e179deeba5cff1ddd730d0752cc947f02a18e4decdc5d63d2f2cd2565'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.221475', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97e73da-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.72219865, 'message_signature': '13ba81d9ae058220e4879ac2bc6fca4088387eff45c3c06e91463ff64f695a10'}]}, 'timestamp': '2026-01-23 09:34:33.221927', '_unique_id': '0d0e48f3033b4e559d524665e40fdead'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.223 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.223 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-361105592>]
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.223 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.223 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dec57c51-f5bc-4f27-9a3b-d64d3cc24ba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.223343', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97eb386-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'cfbc956b9cfdf08f99b01a2eedbaa52db1a401974a7b66f0e00cb4782572311b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.223343', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97ebc8c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'bad2acd259e7439f3c1ee856bdf28432efc08521c60146d4126d6c583ccac3c2'}]}, 'timestamp': '2026-01-23 09:34:33.223787', '_unique_id': '8e59f6425e08456db4c14a45995ac690'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.224 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '400b05b9-30cf-4381-a9ea-0e707dcf017a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.224873', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97eef2c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'c34500131b0cf7e510c52773336c844ec09840b98dedf4fd77592ebe8d355cd1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.224873', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97ef74c-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': '2f2487109d45d4b89239dece071c34a23ef607c7f2161ccc511e1690fa12f5b4'}]}, 'timestamp': '2026-01-23 09:34:33.225290', '_unique_id': 'cfdd2c46497741f6a3a13db14f2ae23f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.225 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.228 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.228 12 DEBUG ceilometer.compute.pollsters [-] 77396169-1daa-41c7-8ba6-c68b50815e2a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c22b608d-63cb-4fdb-91ad-9a60b01bcee0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-vda', 'timestamp': '2026-01-23T09:34:33.228380', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b97f7942-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'f069d601780724a47408ec020f30726c2127ca598cea93bb00ec1813d175014a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '800ea9ca92114ca5bf7589f4500f4bec', 'user_name': None, 'project_id': 'da36a2e2bae4483caec82cba10014d48', 'project_name': None, 'resource_id': '77396169-1daa-41c7-8ba6-c68b50815e2a-sda', 'timestamp': '2026-01-23T09:34:33.228380', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-361105592', 'name': 'instance-00000089', 'instance_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'instance_type': 'm1.nano', 'host': '33201920bf2a68604a3f33fb1c72d204c1b886ac1480286a5c99b18c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b97f8388-f83e-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4378.539242309, 'message_signature': 'f4868d32b4522d48b15866c2a5ce90f83c92017d487b85b1a0656daa2044d231'}]}, 'timestamp': '2026-01-23 09:34:33.228886', '_unique_id': '6c8e60bf49d54f36bf436ae1154bc9c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.229 182096 DEBUG nova.compute.manager [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.229 182096 DEBUG oslo_concurrency.lockutils [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.229 182096 DEBUG oslo_concurrency.lockutils [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.230 182096 DEBUG oslo_concurrency.lockutils [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.230 182096 DEBUG nova.compute.manager [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:34:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:34:33.229 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:34:33 compute-0 nova_compute[182092]: 2026-01-23 09:34:33.230 182096 WARNING nova.compute.manager [req-ee62a72c-e940-485d-9ce7-391a905706ac req-060d237f-71c6-4fbb-92d1-297daa142486 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state None.
Jan 23 09:34:33 compute-0 podman[227166]: 2026-01-23 09:34:33.291116327 +0000 UTC m=+0.033087258 container create 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:34:33 compute-0 systemd[1]: Started libpod-conmon-1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19.scope.
Jan 23 09:34:33 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:34:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6efe530b6f340e3b2c4086766d63dd9b71c581a8596f7f03253dc5e7f469e74f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:34:33 compute-0 podman[227166]: 2026-01-23 09:34:33.35072648 +0000 UTC m=+0.092697411 container init 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:34:33 compute-0 podman[227166]: 2026-01-23 09:34:33.356184183 +0000 UTC m=+0.098155105 container start 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:34:33 compute-0 podman[227166]: 2026-01-23 09:34:33.276827947 +0000 UTC m=+0.018798888 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:34:33 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [NOTICE]   (227182) : New worker (227184) forked
Jan 23 09:34:33 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [NOTICE]   (227182) : Loading success.
Jan 23 09:34:34 compute-0 nova_compute[182092]: 2026-01-23 09:34:34.110 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.309 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.309 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.309 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.309 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.310 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.310 182096 WARNING nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state None.
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.310 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.310 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.310 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.311 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.311 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.311 182096 WARNING nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state None.
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.311 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.312 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.312 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.312 182096 DEBUG oslo_concurrency.lockutils [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.312 182096 DEBUG nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:35 compute-0 nova_compute[182092]: 2026-01-23 09:34:35.313 182096 WARNING nova.compute.manager [req-82eef47a-9269-42f7-9ee1-675e239e2049 req-160b8b9b-5bff-419a-80fd-7e94a4fa69fd 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state active and task_state None.
Jan 23 09:34:37 compute-0 nova_compute[182092]: 2026-01-23 09:34:37.401 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:38 compute-0 podman[227190]: 2026-01-23 09:34:38.221002004 +0000 UTC m=+0.056935710 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 23 09:34:39 compute-0 nova_compute[182092]: 2026-01-23 09:34:39.112 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:39.866 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:39.866 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:39.867 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:42 compute-0 nova_compute[182092]: 2026-01-23 09:34:42.403 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:44 compute-0 nova_compute[182092]: 2026-01-23 09:34:44.113 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:45 compute-0 podman[227220]: 2026-01-23 09:34:45.201136875 +0000 UTC m=+0.035946141 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:34:45 compute-0 podman[227219]: 2026-01-23 09:34:45.20823344 +0000 UTC m=+0.045286707 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:34:45 compute-0 ovn_controller[94697]: 2026-01-23T09:34:45Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:a6:68 10.100.0.12
Jan 23 09:34:47 compute-0 nova_compute[182092]: 2026-01-23 09:34:47.403 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:49 compute-0 nova_compute[182092]: 2026-01-23 09:34:49.115 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.641 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.641 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.657 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.744 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.744 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.750 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.750 182096 INFO nova.compute.claims [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.848 182096 DEBUG nova.compute.provider_tree [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.857 182096 DEBUG nova.scheduler.client.report [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.870 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.870 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.904 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.904 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.914 182096 INFO nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.930 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.996 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.997 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.997 182096 INFO nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Creating image(s)
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.998 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.998 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:50 compute-0 nova_compute[182092]: 2026-01-23 09:34:50.998 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.008 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.043 182096 DEBUG nova.policy [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.058 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.058 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.059 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.067 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.116 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.117 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.138 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.139 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.139 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.186 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.187 182096 DEBUG nova.virt.disk.api [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.187 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.234 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.235 182096 DEBUG nova.virt.disk.api [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.235 182096 DEBUG nova.objects.instance [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.246 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.246 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Ensure instance console log exists: /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.247 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.247 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.247 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:51.726 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.727 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:51.728 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:34:51 compute-0 nova_compute[182092]: 2026-01-23 09:34:51.808 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Successfully created port: dde40d00-302a-4c36-a09f-66e55993a2e3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.405 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.589 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Successfully updated port: dde40d00-302a-4c36-a09f-66e55993a2e3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.602 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.603 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.603 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.678 182096 DEBUG nova.compute.manager [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-changed-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.679 182096 DEBUG nova.compute.manager [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Refreshing instance network info cache due to event network-changed-dde40d00-302a-4c36-a09f-66e55993a2e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.679 182096 DEBUG oslo_concurrency.lockutils [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:34:52 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:52.731 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:52 compute-0 nova_compute[182092]: 2026-01-23 09:34:52.735 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.532 182096 DEBUG nova.network.neutron [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Updating instance_info_cache with network_info: [{"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.551 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.552 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Instance network_info: |[{"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.552 182096 DEBUG oslo_concurrency.lockutils [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.552 182096 DEBUG nova.network.neutron [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Refreshing network info cache for port dde40d00-302a-4c36-a09f-66e55993a2e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.555 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Start _get_guest_xml network_info=[{"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.559 182096 WARNING nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.566 182096 DEBUG nova.virt.libvirt.host [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.566 182096 DEBUG nova.virt.libvirt.host [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.569 182096 DEBUG nova.virt.libvirt.host [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.569 182096 DEBUG nova.virt.libvirt.host [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.570 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.570 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.571 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.571 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.571 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.572 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.572 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.572 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.573 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.573 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.573 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.573 182096 DEBUG nova.virt.hardware [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.576 182096 DEBUG nova.virt.libvirt.vif [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:34:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-882367117',display_name='tempest-TestNetworkBasicOps-server-882367117',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-882367117',id=141,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHbfJRju0E632aWwmt87mM5PvYlhIcLn7MQHjk+yjQY9JUGFfOA11xGE0ldioLa+5I9ygeeGjMH0uMwzTyOgLIJuj6TrXOFYaixsc4s71Cb7h2BILPhDaC+6FsYUPepQhw==',key_name='tempest-TestNetworkBasicOps-440484849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-s3rxxmeo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:34:50Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=5ded6067-8713-4f6f-96bd-1a43e2d8b0e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.577 182096 DEBUG nova.network.os_vif_util [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.577 182096 DEBUG nova.network.os_vif_util [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.578 182096 DEBUG nova.objects.instance [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.606 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <uuid>5ded6067-8713-4f6f-96bd-1a43e2d8b0e9</uuid>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <name>instance-0000008d</name>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-882367117</nova:name>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:34:53</nova:creationTime>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         <nova:port uuid="dde40d00-302a-4c36-a09f-66e55993a2e3">
Jan 23 09:34:53 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <system>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="serial">5ded6067-8713-4f6f-96bd-1a43e2d8b0e9</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="uuid">5ded6067-8713-4f6f-96bd-1a43e2d8b0e9</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </system>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <os>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </os>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <features>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </features>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.config"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:4a:1a:b2"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <target dev="tapdde40d00-30"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/console.log" append="off"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <video>
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </video>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:34:53 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:34:53 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:34:53 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:34:53 compute-0 nova_compute[182092]: </domain>
Jan 23 09:34:53 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.607 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Preparing to wait for external event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.607 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.607 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.607 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.608 182096 DEBUG nova.virt.libvirt.vif [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:34:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-882367117',display_name='tempest-TestNetworkBasicOps-server-882367117',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-882367117',id=141,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHbfJRju0E632aWwmt87mM5PvYlhIcLn7MQHjk+yjQY9JUGFfOA11xGE0ldioLa+5I9ygeeGjMH0uMwzTyOgLIJuj6TrXOFYaixsc4s71Cb7h2BILPhDaC+6FsYUPepQhw==',key_name='tempest-TestNetworkBasicOps-440484849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-s3rxxmeo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:34:50Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=5ded6067-8713-4f6f-96bd-1a43e2d8b0e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.608 182096 DEBUG nova.network.os_vif_util [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.608 182096 DEBUG nova.network.os_vif_util [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.609 182096 DEBUG os_vif [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.609 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.609 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.610 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.613 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.613 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdde40d00-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.614 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdde40d00-30, col_values=(('external_ids', {'iface-id': 'dde40d00-302a-4c36-a09f-66e55993a2e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:1a:b2', 'vm-uuid': '5ded6067-8713-4f6f-96bd-1a43e2d8b0e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:53 compute-0 NetworkManager[54920]: <info>  [1769160893.6159] manager: (tapdde40d00-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.615 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.620 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.621 182096 INFO os_vif [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30')
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.653 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.654 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.654 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:4a:1a:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.654 182096 INFO nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Using config drive
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.905 182096 INFO nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Creating config drive at /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.config
Jan 23 09:34:53 compute-0 nova_compute[182092]: 2026-01-23 09:34:53.909 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpns2d_4k9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.028 182096 DEBUG oslo_concurrency.processutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpns2d_4k9" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:34:54 compute-0 kernel: tapdde40d00-30: entered promiscuous mode
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.0697] manager: (tapdde40d00-30): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 23 09:34:54 compute-0 ovn_controller[94697]: 2026-01-23T09:34:54Z|00541|binding|INFO|Claiming lport dde40d00-302a-4c36-a09f-66e55993a2e3 for this chassis.
Jan 23 09:34:54 compute-0 ovn_controller[94697]: 2026-01-23T09:34:54Z|00542|binding|INFO|dde40d00-302a-4c36-a09f-66e55993a2e3: Claiming fa:16:3e:4a:1a:b2 10.100.0.23
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.069 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.076 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:1a:b2 10.100.0.23'], port_security=['fa:16:3e:4a:1a:b2 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '5ded6067-8713-4f6f-96bd-1a43e2d8b0e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7f61c8af-d74f-4a5f-9a97-e72f691af1e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19dbcaf6-a56c-49d4-8765-993dcb07bdc2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dde40d00-302a-4c36-a09f-66e55993a2e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.077 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dde40d00-302a-4c36-a09f-66e55993a2e3 in datapath a1939a22-80d8-4e65-b27f-a9a368a3d7d2 bound to our chassis
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.078 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1939a22-80d8-4e65-b27f-a9a368a3d7d2
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.087 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8341ec2c-79f8-4d1f-b5f6-9579be776f2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.088 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1939a22-81 in ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.089 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1939a22-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.089 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d743e7c6-a3d5-4846-af85-ba95b6d87548]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.090 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[74867358-f67a-4d55-ac15-733c01843471]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 systemd-machined[153562]: New machine qemu-72-instance-0000008d.
Jan 23 09:34:54 compute-0 systemd-udevd[227293]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:34:54 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-0000008d.
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.100 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4214b5-aa4d-44b6-a9e1-82deb23f3e82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.109 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_controller[94697]: 2026-01-23T09:34:54Z|00543|binding|INFO|Setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 ovn-installed in OVS
Jan 23 09:34:54 compute-0 ovn_controller[94697]: 2026-01-23T09:34:54Z|00544|binding|INFO|Setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 up in Southbound
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.111 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.112 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34650010-acd1-429b-b364-e67c413d11d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.1167] device (tapdde40d00-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.1172] device (tapdde40d00-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.135 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c8287d55-7278-494b-a0cf-1279127f7dd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.1395] manager: (tapa1939a22-80): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.139 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e786c625-1ad9-4f19-813b-d041992cbefb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.165 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[86b691bb-8c6f-4ab5-b39a-3db3535a00c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.169 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4f0d45-0936-4133-b47d-4ef24b22366b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.1859] device (tapa1939a22-80): carrier: link connected
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.189 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b3bd9794-cfb1-4918-a3ad-8aab3e4b5536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6b88f9c4-af50-4033-98b7-39c45b2f1303]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1939a22-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:6c:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439967, 'reachable_time': 44333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227316, 'error': None, 'target': 'ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.212 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[303bbc0c-06b7-44a8-a53b-fa2992a1cd06]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:6c36'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439967, 'tstamp': 439967}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227317, 'error': None, 'target': 'ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.224 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9f15d252-1e6c-4105-8d95-c3b22794a0d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1939a22-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:6c:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439967, 'reachable_time': 44333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227319, 'error': None, 'target': 'ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.243 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2bef2c-6bbe-4cff-9b9f-fa152ecdeb91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.280 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1f504603-33e8-4193-9cf2-4dafa21fcb74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.281 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1939a22-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.282 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.282 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1939a22-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:54 compute-0 NetworkManager[54920]: <info>  [1769160894.2845] manager: (tapa1939a22-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 23 09:34:54 compute-0 kernel: tapa1939a22-80: entered promiscuous mode
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.285 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.287 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1939a22-80, col_values=(('external_ids', {'iface-id': '13c50867-eb70-4f1d-9c02-9c1ee6914bff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.288 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_controller[94697]: 2026-01-23T09:34:54Z|00545|binding|INFO|Releasing lport 13c50867-eb70-4f1d-9c02-9c1ee6914bff from this chassis (sb_readonly=0)
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.289 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.290 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1939a22-80d8-4e65-b27f-a9a368a3d7d2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1939a22-80d8-4e65-b27f-a9a368a3d7d2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.291 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5633f349-16ad-4a4e-8d48-22bb459ea2ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.292 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-a1939a22-80d8-4e65-b27f-a9a368a3d7d2
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/a1939a22-80d8-4e65-b27f-a9a368a3d7d2.pid.haproxy
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID a1939a22-80d8-4e65-b27f-a9a368a3d7d2
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:34:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:34:54.293 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'env', 'PROCESS_TAG=haproxy-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1939a22-80d8-4e65-b27f-a9a368a3d7d2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.297 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160894.2973359, 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.297 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] VM Started (Lifecycle Event)
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.300 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.313 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.316 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160894.2973998, 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.316 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] VM Paused (Lifecycle Event)
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.332 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.334 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.348 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.459 182096 DEBUG nova.compute.manager [req-8478d463-0b44-46b1-9427-968b7502f973 req-d60e3405-4cf6-436f-98e7-a2493e9c53a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.459 182096 DEBUG oslo_concurrency.lockutils [req-8478d463-0b44-46b1-9427-968b7502f973 req-d60e3405-4cf6-436f-98e7-a2493e9c53a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.464 182096 DEBUG oslo_concurrency.lockutils [req-8478d463-0b44-46b1-9427-968b7502f973 req-d60e3405-4cf6-436f-98e7-a2493e9c53a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.464 182096 DEBUG oslo_concurrency.lockutils [req-8478d463-0b44-46b1-9427-968b7502f973 req-d60e3405-4cf6-436f-98e7-a2493e9c53a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.464 182096 DEBUG nova.compute.manager [req-8478d463-0b44-46b1-9427-968b7502f973 req-d60e3405-4cf6-436f-98e7-a2493e9c53a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Processing event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.465 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.467 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160894.4676898, 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.468 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] VM Resumed (Lifecycle Event)
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.468 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.470 182096 INFO nova.virt.libvirt.driver [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Instance spawned successfully.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.471 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.492 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.500 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.501 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.501 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.501 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.502 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.502 182096 DEBUG nova.virt.libvirt.driver [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.505 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.534 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.554 182096 INFO nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Took 3.56 seconds to spawn the instance on the hypervisor.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.554 182096 DEBUG nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:34:54 compute-0 podman[227353]: 2026-01-23 09:34:54.583676886 +0000 UTC m=+0.032918150 container create cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:34:54 compute-0 systemd[1]: Started libpod-conmon-cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd.scope.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.620 182096 INFO nova.compute.manager [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Took 3.92 seconds to build instance.
Jan 23 09:34:54 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.636 182096 DEBUG oslo_concurrency.lockutils [None req-468ab3c0-19d0-4165-b5bd-9d488d8098dd 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a282128bd48ece4be13469a68ba7d64c43393043d25470733c0a92d58c43ef92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.645 182096 DEBUG nova.network.neutron [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Updated VIF entry in instance network info cache for port dde40d00-302a-4c36-a09f-66e55993a2e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.646 182096 DEBUG nova.network.neutron [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Updating instance_info_cache with network_info: [{"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:34:54 compute-0 podman[227353]: 2026-01-23 09:34:54.648375797 +0000 UTC m=+0.097617081 container init cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:34:54 compute-0 podman[227353]: 2026-01-23 09:34:54.653275759 +0000 UTC m=+0.102517023 container start cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:34:54 compute-0 podman[227353]: 2026-01-23 09:34:54.569924276 +0000 UTC m=+0.019165560 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:34:54 compute-0 nova_compute[182092]: 2026-01-23 09:34:54.656 182096 DEBUG oslo_concurrency.lockutils [req-b05c0c83-6a64-4d6e-8ebd-fca2af6a96c6 req-9ef8fa3e-74da-4a4c-b688-733fa16a961d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:34:54 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [NOTICE]   (227369) : New worker (227371) forked
Jan 23 09:34:54 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [NOTICE]   (227369) : Loading success.
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.540 182096 DEBUG nova.compute.manager [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.541 182096 DEBUG oslo_concurrency.lockutils [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.541 182096 DEBUG oslo_concurrency.lockutils [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.541 182096 DEBUG oslo_concurrency.lockutils [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.541 182096 DEBUG nova.compute.manager [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] No waiting events found dispatching network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:34:56 compute-0 nova_compute[182092]: 2026-01-23 09:34:56.541 182096 WARNING nova.compute.manager [req-e7e54372-1309-418d-9524-453171e508d8 req-cb1fed51-6532-4691-b783-77b9f5f8535f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received unexpected event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 for instance with vm_state active and task_state None.
Jan 23 09:34:57 compute-0 podman[227376]: 2026-01-23 09:34:57.206756508 +0000 UTC m=+0.042986209 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 09:34:57 compute-0 podman[227377]: 2026-01-23 09:34:57.226155578 +0000 UTC m=+0.060885259 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:34:57 compute-0 nova_compute[182092]: 2026-01-23 09:34:57.408 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:34:58 compute-0 nova_compute[182092]: 2026-01-23 09:34:58.616 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:00 compute-0 podman[227413]: 2026-01-23 09:35:00.227339689 +0000 UTC m=+0.065005260 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:35:02 compute-0 NetworkManager[54920]: <info>  [1769160902.0380] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Jan 23 09:35:02 compute-0 NetworkManager[54920]: <info>  [1769160902.0385] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 23 09:35:02 compute-0 nova_compute[182092]: 2026-01-23 09:35:02.038 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:02 compute-0 nova_compute[182092]: 2026-01-23 09:35:02.173 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:02 compute-0 ovn_controller[94697]: 2026-01-23T09:35:02Z|00546|binding|INFO|Releasing lport 13c50867-eb70-4f1d-9c02-9c1ee6914bff from this chassis (sb_readonly=0)
Jan 23 09:35:02 compute-0 ovn_controller[94697]: 2026-01-23T09:35:02Z|00547|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:35:02 compute-0 nova_compute[182092]: 2026-01-23 09:35:02.188 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:02 compute-0 nova_compute[182092]: 2026-01-23 09:35:02.408 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:03 compute-0 nova_compute[182092]: 2026-01-23 09:35:03.618 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:05 compute-0 ovn_controller[94697]: 2026-01-23T09:35:05Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:1a:b2 10.100.0.23
Jan 23 09:35:05 compute-0 ovn_controller[94697]: 2026-01-23T09:35:05Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:1a:b2 10.100.0.23
Jan 23 09:35:06 compute-0 nova_compute[182092]: 2026-01-23 09:35:06.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:07 compute-0 nova_compute[182092]: 2026-01-23 09:35:07.410 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:08 compute-0 nova_compute[182092]: 2026-01-23 09:35:08.622 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:08 compute-0 podman[227446]: 2026-01-23 09:35:08.704236786 +0000 UTC m=+0.060031298 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:35:09 compute-0 nova_compute[182092]: 2026-01-23 09:35:09.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:09 compute-0 nova_compute[182092]: 2026-01-23 09:35:09.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:09 compute-0 nova_compute[182092]: 2026-01-23 09:35:09.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:09 compute-0 nova_compute[182092]: 2026-01-23 09:35:09.879 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:10 compute-0 nova_compute[182092]: 2026-01-23 09:35:10.957 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:10 compute-0 nova_compute[182092]: 2026-01-23 09:35:10.958 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:10 compute-0 nova_compute[182092]: 2026-01-23 09:35:10.958 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:10 compute-0 nova_compute[182092]: 2026-01-23 09:35:10.958 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.291 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.338 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.339 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.386 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.390 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.438 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.439 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.486 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.701 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.702 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5412MB free_disk=73.16934204101562GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.702 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:11 compute-0 nova_compute[182092]: 2026-01-23 09:35:11.703 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:12 compute-0 nova_compute[182092]: 2026-01-23 09:35:12.411 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.393 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 77396169-1daa-41c7-8ba6-c68b50815e2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.393 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.393 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.394 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.453 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.470 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.499 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.499 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:13 compute-0 nova_compute[182092]: 2026-01-23 09:35:13.623 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.452 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.452 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.452 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.453 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.453 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.462 182096 INFO nova.compute.manager [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Terminating instance
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.468 182096 DEBUG nova.compute.manager [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:35:14 compute-0 kernel: tapdde40d00-30 (unregistering): left promiscuous mode
Jan 23 09:35:14 compute-0 NetworkManager[54920]: <info>  [1769160914.4945] device (tapdde40d00-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.502 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.502 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.502 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:35:14 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.503 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:35:14 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00548|binding|INFO|Releasing lport dde40d00-302a-4c36-a09f-66e55993a2e3 from this chassis (sb_readonly=0)
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00549|binding|INFO|Setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 down in Southbound
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00550|binding|INFO|Removing iface tapdde40d00-30 ovn-installed in OVS
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.513 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.514 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:1a:b2 10.100.0.23'], port_security=['fa:16:3e:4a:1a:b2 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '5ded6067-8713-4f6f-96bd-1a43e2d8b0e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f61c8af-d74f-4a5f-9a97-e72f691af1e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19dbcaf6-a56c-49d4-8765-993dcb07bdc2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dde40d00-302a-4c36-a09f-66e55993a2e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.515 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dde40d00-302a-4c36-a09f-66e55993a2e3 in datapath a1939a22-80d8-4e65-b27f-a9a368a3d7d2 unbound from our chassis
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.517 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1939a22-80d8-4e65-b27f-a9a368a3d7d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.518 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[874cdf04-a2b5-4be7-a9be-66ca7d8f5108]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.519 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2 namespace which is not needed anymore
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.524 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Jan 23 09:35:14 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008d.scope: Consumed 10.909s CPU time.
Jan 23 09:35:14 compute-0 systemd-machined[153562]: Machine qemu-72-instance-0000008d terminated.
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.550 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [NOTICE]   (227369) : haproxy version is 2.8.14-c23fe91
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [NOTICE]   (227369) : path to executable is /usr/sbin/haproxy
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [WARNING]  (227369) : Exiting Master process...
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [WARNING]  (227369) : Exiting Master process...
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [ALERT]    (227369) : Current worker (227371) exited with code 143 (Terminated)
Jan 23 09:35:14 compute-0 neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2[227365]: [WARNING]  (227369) : All workers exited. Exiting... (0)
Jan 23 09:35:14 compute-0 systemd[1]: libpod-cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd.scope: Deactivated successfully.
Jan 23 09:35:14 compute-0 podman[227504]: 2026-01-23 09:35:14.618484009 +0000 UTC m=+0.034573441 container died cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:35:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd-userdata-shm.mount: Deactivated successfully.
Jan 23 09:35:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-a282128bd48ece4be13469a68ba7d64c43393043d25470733c0a92d58c43ef92-merged.mount: Deactivated successfully.
Jan 23 09:35:14 compute-0 podman[227504]: 2026-01-23 09:35:14.638528195 +0000 UTC m=+0.054617628 container cleanup cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:35:14 compute-0 systemd[1]: libpod-conmon-cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd.scope: Deactivated successfully.
Jan 23 09:35:14 compute-0 podman[227527]: 2026-01-23 09:35:14.676928514 +0000 UTC m=+0.023585818 container remove cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:35:14 compute-0 kernel: tapdde40d00-30: entered promiscuous mode
Jan 23 09:35:14 compute-0 kernel: tapdde40d00-30 (unregistering): left promiscuous mode
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.681 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e74802d7-710e-450e-bbd6-fe37233db530]: (4, ('Fri Jan 23 09:35:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2 (cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd)\ncbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd\nFri Jan 23 09:35:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2 (cbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd)\ncbc104c7b1802c336fa578284c6519c0efb1be3ff7921fe6ad0c460e5f2f0bfd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00551|binding|INFO|Claiming lport dde40d00-302a-4c36-a09f-66e55993a2e3 for this chassis.
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00552|binding|INFO|dde40d00-302a-4c36-a09f-66e55993a2e3: Claiming fa:16:3e:4a:1a:b2 10.100.0.23
Jan 23 09:35:14 compute-0 NetworkManager[54920]: <info>  [1769160914.6839] manager: (tapdde40d00-30): new Tun device (/org/freedesktop/NetworkManager/Devices/280)
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.687 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[31fa79c1-0c1d-41fc-84ab-4d9258a729ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.688 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1939a22-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.690 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.691 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:1a:b2 10.100.0.23'], port_security=['fa:16:3e:4a:1a:b2 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '5ded6067-8713-4f6f-96bd-1a43e2d8b0e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f61c8af-d74f-4a5f-9a97-e72f691af1e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19dbcaf6-a56c-49d4-8765-993dcb07bdc2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dde40d00-302a-4c36-a09f-66e55993a2e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.700 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 kernel: tapa1939a22-80: left promiscuous mode
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.712 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00553|binding|INFO|Setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 up in Southbound
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00554|binding|INFO|Releasing lport dde40d00-302a-4c36-a09f-66e55993a2e3 from this chassis (sb_readonly=1)
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00555|if_status|INFO|Dropped 1 log messages in last 82 seconds (most recently, 82 seconds ago) due to excessive rate
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00556|if_status|INFO|Not setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 down as sb is readonly
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00557|binding|INFO|Releasing lport dde40d00-302a-4c36-a09f-66e55993a2e3 from this chassis (sb_readonly=0)
Jan 23 09:35:14 compute-0 ovn_controller[94697]: 2026-01-23T09:35:14Z|00558|binding|INFO|Setting lport dde40d00-302a-4c36-a09f-66e55993a2e3 down in Southbound
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.725 182096 INFO nova.virt.libvirt.driver [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Instance destroyed successfully.
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.725 182096 DEBUG nova.objects.instance [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.727 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:1a:b2 10.100.0.23'], port_security=['fa:16:3e:4a:1a:b2 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '5ded6067-8713-4f6f-96bd-1a43e2d8b0e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f61c8af-d74f-4a5f-9a97-e72f691af1e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19dbcaf6-a56c-49d4-8765-993dcb07bdc2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dde40d00-302a-4c36-a09f-66e55993a2e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.728 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.731 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c69f9b-1e12-4409-baf6-6a5045629988]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.742 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e267ae-5158-4302-97d8-888e6e5e8e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.743 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1cff7ce0-d6cf-4645-ae2a-3dfe16e8aacd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.755 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[efd68309-9192-409d-bd4d-504f129b2069]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439961, 'reachable_time': 40778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227559, 'error': None, 'target': 'ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 systemd[1]: run-netns-ovnmeta\x2da1939a22\x2d80d8\x2d4e65\x2db27f\x2da9a368a3d7d2.mount: Deactivated successfully.
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.757 182096 DEBUG nova.virt.libvirt.vif [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:34:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-882367117',display_name='tempest-TestNetworkBasicOps-server-882367117',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-882367117',id=141,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHbfJRju0E632aWwmt87mM5PvYlhIcLn7MQHjk+yjQY9JUGFfOA11xGE0ldioLa+5I9ygeeGjMH0uMwzTyOgLIJuj6TrXOFYaixsc4s71Cb7h2BILPhDaC+6FsYUPepQhw==',key_name='tempest-TestNetworkBasicOps-440484849',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:34:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-s3rxxmeo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:34:54Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=5ded6067-8713-4f6f-96bd-1a43e2d8b0e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.757 182096 DEBUG nova.network.os_vif_util [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "dde40d00-302a-4c36-a09f-66e55993a2e3", "address": "fa:16:3e:4a:1a:b2", "network": {"id": "a1939a22-80d8-4e65-b27f-a9a368a3d7d2", "bridge": "br-int", "label": "tempest-network-smoke--276554027", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdde40d00-30", "ovs_interfaceid": "dde40d00-302a-4c36-a09f-66e55993a2e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.758 182096 DEBUG nova.network.os_vif_util [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.758 182096 DEBUG os_vif [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.759 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.759 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1939a22-80d8-4e65-b27f-a9a368a3d7d2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.760 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdde40d00-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.759 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5f98ec01-e300-460b-b0de-12b53d87e685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.760 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dde40d00-302a-4c36-a09f-66e55993a2e3 in datapath a1939a22-80d8-4e65-b27f-a9a368a3d7d2 unbound from our chassis
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.761 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1939a22-80d8-4e65-b27f-a9a368a3d7d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.762 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.763 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.762 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[13ca77df-56ab-4bff-a771-e0065e407bf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.763 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dde40d00-302a-4c36-a09f-66e55993a2e3 in datapath a1939a22-80d8-4e65-b27f-a9a368a3d7d2 unbound from our chassis
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.764 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1939a22-80d8-4e65-b27f-a9a368a3d7d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.765 182096 INFO os_vif [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:1a:b2,bridge_name='br-int',has_traffic_filtering=True,id=dde40d00-302a-4c36-a09f-66e55993a2e3,network=Network(a1939a22-80d8-4e65-b27f-a9a368a3d7d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdde40d00-30')
Jan 23 09:35:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:14.765 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2c721042-6e59-43d7-a0d9-937c17d16bd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.765 182096 INFO nova.virt.libvirt.driver [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Deleting instance files /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9_del
Jan 23 09:35:14 compute-0 nova_compute[182092]: 2026-01-23 09:35:14.765 182096 INFO nova.virt.libvirt.driver [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Deletion of /var/lib/nova/instances/5ded6067-8713-4f6f-96bd-1a43e2d8b0e9_del complete
Jan 23 09:35:15 compute-0 nova_compute[182092]: 2026-01-23 09:35:15.443 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:35:15 compute-0 nova_compute[182092]: 2026-01-23 09:35:15.443 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:35:15 compute-0 nova_compute[182092]: 2026-01-23 09:35:15.443 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:35:15 compute-0 nova_compute[182092]: 2026-01-23 09:35:15.444 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:16 compute-0 podman[227561]: 2026-01-23 09:35:16.206578893 +0000 UTC m=+0.038565882 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:35:16 compute-0 podman[227560]: 2026-01-23 09:35:16.211559497 +0000 UTC m=+0.045341853 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 23 09:35:16 compute-0 nova_compute[182092]: 2026-01-23 09:35:16.543 182096 INFO nova.compute.manager [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Took 2.08 seconds to destroy the instance on the hypervisor.
Jan 23 09:35:16 compute-0 nova_compute[182092]: 2026-01-23 09:35:16.544 182096 DEBUG oslo.service.loopingcall [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:35:16 compute-0 nova_compute[182092]: 2026-01-23 09:35:16.544 182096 DEBUG nova.compute.manager [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:35:16 compute-0 nova_compute[182092]: 2026-01-23 09:35:16.544 182096 DEBUG nova.network.neutron [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:35:17 compute-0 nova_compute[182092]: 2026-01-23 09:35:17.414 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.710 182096 DEBUG nova.compute.manager [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-unplugged-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.710 182096 DEBUG oslo_concurrency.lockutils [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.710 182096 DEBUG oslo_concurrency.lockutils [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.710 182096 DEBUG oslo_concurrency.lockutils [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.711 182096 DEBUG nova.compute.manager [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] No waiting events found dispatching network-vif-unplugged-dde40d00-302a-4c36-a09f-66e55993a2e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.711 182096 DEBUG nova.compute.manager [req-7a6addef-3a56-409e-a24d-ef21937ccfa7 req-df7bbe22-a2bc-4cb7-b13c-f83b80433464 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-unplugged-dde40d00-302a-4c36-a09f-66e55993a2e3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.777 182096 DEBUG nova.network.neutron [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.799 182096 INFO nova.compute.manager [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Took 2.25 seconds to deallocate network for instance.
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.867 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.867 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.942 182096 DEBUG nova.compute.provider_tree [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.959 182096 DEBUG nova.scheduler.client.report [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:18 compute-0 nova_compute[182092]: 2026-01-23 09:35:18.997 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.033 182096 INFO nova.scheduler.client.report [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.096 182096 DEBUG oslo_concurrency.lockutils [None req-a90a57a7-fa6b-4a0c-9287-06fdbb17a9c0 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.207 182096 DEBUG nova.compute.manager [req-18912852-008a-46dd-b0c6-0b011b011b12 req-bcabb3cc-8edc-40e4-9b40-75151f59da8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-deleted-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.419 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [{"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.440 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-77396169-1daa-41c7-8ba6-c68b50815e2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.440 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.441 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.441 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.441 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.441 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.924 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.925 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:19 compute-0 nova_compute[182092]: 2026-01-23 09:35:19.945 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.031 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.031 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.035 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.035 182096 INFO nova.compute.claims [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.202 182096 DEBUG nova.compute.provider_tree [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.227 182096 DEBUG nova.scheduler.client.report [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.262 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.262 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.322 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.322 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.355 182096 INFO nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.367 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.449 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.450 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.450 182096 INFO nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Creating image(s)
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.451 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.451 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.452 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.462 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.511 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.512 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.513 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.522 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.545 182096 DEBUG nova.policy [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2ccb949ece647429b744fd0f25e41b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5ee87b8f51c547bc8fbc47a9fc3839be', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.583 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.583 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.618 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.619 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.619 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:20 compute-0 ovn_controller[94697]: 2026-01-23T09:35:20Z|00559|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.639 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.687 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.688 182096 DEBUG nova.virt.disk.api [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Checking if we can resize image /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.688 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.752 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.753 182096 DEBUG nova.virt.disk.api [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Cannot resize image /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.753 182096 DEBUG nova.objects.instance [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lazy-loading 'migration_context' on Instance uuid 6c69a90c-edeb-4866-b2da-0b4acfeb2355 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.773 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.774 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Ensure instance console log exists: /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.774 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.774 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.775 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:20 compute-0 ovn_controller[94697]: 2026-01-23T09:35:20Z|00560|binding|INFO|Releasing lport ce525290-d2d6-41ee-a73a-700c8cf045a2 from this chassis (sb_readonly=0)
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.815 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.819 182096 DEBUG nova.compute.manager [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.820 182096 DEBUG oslo_concurrency.lockutils [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.820 182096 DEBUG oslo_concurrency.lockutils [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.820 182096 DEBUG oslo_concurrency.lockutils [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "5ded6067-8713-4f6f-96bd-1a43e2d8b0e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.820 182096 DEBUG nova.compute.manager [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] No waiting events found dispatching network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:20 compute-0 nova_compute[182092]: 2026-01-23 09:35:20.821 182096 WARNING nova.compute.manager [req-cfda433c-dafc-4ef1-92cb-4a8b8c0d9a5d req-8cb26c7c-579c-4ec0-897a-2cb5ad04f068 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Received unexpected event network-vif-plugged-dde40d00-302a-4c36-a09f-66e55993a2e3 for instance with vm_state deleted and task_state None.
Jan 23 09:35:22 compute-0 nova_compute[182092]: 2026-01-23 09:35:22.416 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:23 compute-0 nova_compute[182092]: 2026-01-23 09:35:23.201 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Successfully created port: 073e61a9-60a4-439f-a2f4-36ae5c072adc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.354 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Successfully updated port: 073e61a9-60a4-439f-a2f4-36ae5c072adc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.383 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.383 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquired lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.383 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.488 182096 DEBUG nova.compute.manager [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.488 182096 DEBUG nova.compute.manager [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing instance network info cache due to event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.488 182096 DEBUG oslo_concurrency.lockutils [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.607 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:35:24 compute-0 nova_compute[182092]: 2026-01-23 09:35:24.762 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.390 182096 DEBUG nova.network.neutron [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.408 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Releasing lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.408 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Instance network_info: |[{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.408 182096 DEBUG oslo_concurrency.lockutils [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.409 182096 DEBUG nova.network.neutron [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.411 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Start _get_guest_xml network_info=[{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.415 182096 WARNING nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.423 182096 DEBUG nova.virt.libvirt.host [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.423 182096 DEBUG nova.virt.libvirt.host [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.426 182096 DEBUG nova.virt.libvirt.host [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.427 182096 DEBUG nova.virt.libvirt.host [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.428 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.428 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.428 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.429 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.429 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.429 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.429 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.429 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.430 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.430 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.430 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.430 182096 DEBUG nova.virt.hardware [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.433 182096 DEBUG nova.virt.libvirt.vif [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-275687208-acc',id=142,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDCRgH5f9fNAe+kWWWE1cX+DmnsGpTnYchAJ3P9bncspSdyO/BPBk+/eEXnS9GohvsEXpfFZL+Px5hyY0cktFgiS+i2z7DYQCRG1mHzRuamYVxyUS4ddWC35fBRR0CTOaQ==',key_name='tempest-TestSecurityGroupsBasicOps-851949540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ee87b8f51c547bc8fbc47a9fc3839be',ramdisk_id='',reservation_id='r-5nqyk5tz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-275687208',owner_user_name='tempest-TestSecurityGroupsBasicOps-275687208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:35:20Z,user_data=None,user_id='a2ccb949ece647429b744fd0f25e41b4',uuid=6c69a90c-edeb-4866-b2da-0b4acfeb2355,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.434 182096 DEBUG nova.network.os_vif_util [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converting VIF {"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.434 182096 DEBUG nova.network.os_vif_util [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.435 182096 DEBUG nova.objects.instance [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c69a90c-edeb-4866-b2da-0b4acfeb2355 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.445 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <uuid>6c69a90c-edeb-4866-b2da-0b4acfeb2355</uuid>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <name>instance-0000008e</name>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410</nova:name>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:35:26</nova:creationTime>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:user uuid="a2ccb949ece647429b744fd0f25e41b4">tempest-TestSecurityGroupsBasicOps-275687208-project-member</nova:user>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:project uuid="5ee87b8f51c547bc8fbc47a9fc3839be">tempest-TestSecurityGroupsBasicOps-275687208</nova:project>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         <nova:port uuid="073e61a9-60a4-439f-a2f4-36ae5c072adc">
Jan 23 09:35:26 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <system>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="serial">6c69a90c-edeb-4866-b2da-0b4acfeb2355</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="uuid">6c69a90c-edeb-4866-b2da-0b4acfeb2355</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </system>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <os>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </os>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <features>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </features>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.config"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:c9:b1:cc"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <target dev="tap073e61a9-60"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/console.log" append="off"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <video>
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </video>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:35:26 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:35:26 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:35:26 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:35:26 compute-0 nova_compute[182092]: </domain>
Jan 23 09:35:26 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.447 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Preparing to wait for external event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.447 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.447 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.448 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.448 182096 DEBUG nova.virt.libvirt.vif [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-275687208-acc',id=142,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDCRgH5f9fNAe+kWWWE1cX+DmnsGpTnYchAJ3P9bncspSdyO/BPBk+/eEXnS9GohvsEXpfFZL+Px5hyY0cktFgiS+i2z7DYQCRG1mHzRuamYVxyUS4ddWC35fBRR0CTOaQ==',key_name='tempest-TestSecurityGroupsBasicOps-851949540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ee87b8f51c547bc8fbc47a9fc3839be',ramdisk_id='',reservation_id='r-5nqyk5tz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-275687208',owner_user_name='tempest-TestSecurityGroupsBasicOps-275687208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:35:20Z,user_data=None,user_id='a2ccb949ece647429b744fd0f25e41b4',uuid=6c69a90c-edeb-4866-b2da-0b4acfeb2355,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.449 182096 DEBUG nova.network.os_vif_util [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converting VIF {"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.449 182096 DEBUG nova.network.os_vif_util [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.450 182096 DEBUG os_vif [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.450 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.451 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.451 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.453 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.453 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap073e61a9-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.454 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap073e61a9-60, col_values=(('external_ids', {'iface-id': '073e61a9-60a4-439f-a2f4-36ae5c072adc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:b1:cc', 'vm-uuid': '6c69a90c-edeb-4866-b2da-0b4acfeb2355'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:26 compute-0 NetworkManager[54920]: <info>  [1769160926.4558] manager: (tap073e61a9-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.457 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.459 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.460 182096 INFO os_vif [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60')
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.497 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.497 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.497 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] No VIF found with MAC fa:16:3e:c9:b1:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:35:26 compute-0 nova_compute[182092]: 2026-01-23 09:35:26.499 182096 INFO nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Using config drive
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.417 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.449 182096 INFO nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Creating config drive at /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.config
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.453 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup3zyhay execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.575 182096 DEBUG oslo_concurrency.processutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup3zyhay" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.6305] manager: (tap073e61a9-60): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Jan 23 09:35:27 compute-0 kernel: tap073e61a9-60: entered promiscuous mode
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.636 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 ovn_controller[94697]: 2026-01-23T09:35:27Z|00561|binding|INFO|Claiming lport 073e61a9-60a4-439f-a2f4-36ae5c072adc for this chassis.
Jan 23 09:35:27 compute-0 ovn_controller[94697]: 2026-01-23T09:35:27Z|00562|binding|INFO|073e61a9-60a4-439f-a2f4-36ae5c072adc: Claiming fa:16:3e:c9:b1:cc 10.100.0.5
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.647 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:b1:cc 10.100.0.5'], port_security=['fa:16:3e:c9:b1:cc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6c69a90c-edeb-4866-b2da-0b4acfeb2355', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ee87b8f51c547bc8fbc47a9fc3839be', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4347c1ec-b03a-433a-b1a4-bfdc6f5a2670 d6921887-b47d-44d3-ae5c-aedf46e6ccbb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88512b09-a7ca-4424-b849-b1221a082f4f, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=073e61a9-60a4-439f-a2f4-36ae5c072adc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.648 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 073e61a9-60a4-439f-a2f4-36ae5c072adc in datapath b9132fe1-3e71-471b-b4bd-f2fa5155726e bound to our chassis
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.649 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9132fe1-3e71-471b-b4bd-f2fa5155726e
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.657 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[73a293aa-f00b-4759-9386-684fe597074b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.658 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9132fe1-31 in ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.659 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9132fe1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.659 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[afe0758e-734b-43b4-9df5-ee65c96034dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.660 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aabab3f4-26b6-4475-9cec-db525613e131]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.672 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[a4da7bb3-02b1-4613-a432-37c5b3512f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 systemd-machined[153562]: New machine qemu-73-instance-0000008e.
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.695 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed4b5b4-328b-4c4d-aeb7-3facbaf62d94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-0000008e.
Jan 23 09:35:27 compute-0 ovn_controller[94697]: 2026-01-23T09:35:27Z|00563|binding|INFO|Setting lport 073e61a9-60a4-439f-a2f4-36ae5c072adc ovn-installed in OVS
Jan 23 09:35:27 compute-0 ovn_controller[94697]: 2026-01-23T09:35:27Z|00564|binding|INFO|Setting lport 073e61a9-60a4-439f-a2f4-36ae5c072adc up in Southbound
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.700 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 podman[227629]: 2026-01-23 09:35:27.714821641 +0000 UTC m=+0.086428448 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:35:27 compute-0 systemd-udevd[227678]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:35:27 compute-0 podman[227628]: 2026-01-23 09:35:27.726790296 +0000 UTC m=+0.100984292 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.726 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5e04c5d4-9b32-4cdf-9b5d-9d79fbbd0d9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.732 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[987a32a4-010f-4063-a12e-42083f510eec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.7353] manager: (tapb9132fe1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/283)
Jan 23 09:35:27 compute-0 systemd-udevd[227682]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.7383] device (tap073e61a9-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.7387] device (tap073e61a9-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.760 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d614d136-21b0-4e16-9b5a-8ad8eb7699e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.762 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[16c39901-7845-40fe-b1f9-2cdff79221f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.7779] device (tapb9132fe1-30): carrier: link connected
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.782 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[52e065bd-6f12-4d47-b5a1-ca27be03811a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.794 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ba13f1c2-7cad-437f-85c4-6c8e69600a53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9132fe1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:01:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443326, 'reachable_time': 28369, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227699, 'error': None, 'target': 'ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.805 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[593e7666-0d47-4a78-ae8e-7bed441d2398]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefb:1ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443326, 'tstamp': 443326}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227700, 'error': None, 'target': 'ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.817 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e018e3a1-1afc-4dad-8ff7-8d3084f9ddff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9132fe1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fb:01:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443326, 'reachable_time': 28369, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227701, 'error': None, 'target': 'ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.839 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[38e5cb16-193a-4f81-b273-36376d18d087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.879 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d65878ac-f000-43d7-9778-e45580ff8572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.880 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9132fe1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.880 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.880 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9132fe1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:27 compute-0 kernel: tapb9132fe1-30: entered promiscuous mode
Jan 23 09:35:27 compute-0 NetworkManager[54920]: <info>  [1769160927.8824] manager: (tapb9132fe1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.884 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9132fe1-30, col_values=(('external_ids', {'iface-id': 'a2356132-1424-44aa-a17c-91a644594cb4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:27 compute-0 ovn_controller[94697]: 2026-01-23T09:35:27Z|00565|binding|INFO|Releasing lport a2356132-1424-44aa-a17c-91a644594cb4 from this chassis (sb_readonly=0)
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.896 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 nova_compute[182092]: 2026-01-23 09:35:27.898 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.899 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9132fe1-3e71-471b-b4bd-f2fa5155726e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9132fe1-3e71-471b-b4bd-f2fa5155726e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.899 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[89618898-e2b4-4923-b3d4-b008b145e206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.900 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-b9132fe1-3e71-471b-b4bd-f2fa5155726e
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/b9132fe1-3e71-471b-b4bd-f2fa5155726e.pid.haproxy
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID b9132fe1-3e71-471b-b4bd-f2fa5155726e
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:35:27 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:27.900 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'env', 'PROCESS_TAG=haproxy-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9132fe1-3e71-471b-b4bd-f2fa5155726e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.069 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.070 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.070 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.070 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.070 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.078 182096 INFO nova.compute.manager [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Terminating instance
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.084 182096 DEBUG nova.compute.manager [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:35:28 compute-0 kernel: tapeaf8bed9-e3 (unregistering): left promiscuous mode
Jan 23 09:35:28 compute-0 NetworkManager[54920]: <info>  [1769160928.1070] device (tapeaf8bed9-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:35:28 compute-0 ovn_controller[94697]: 2026-01-23T09:35:28Z|00566|binding|INFO|Releasing lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 from this chassis (sb_readonly=0)
Jan 23 09:35:28 compute-0 ovn_controller[94697]: 2026-01-23T09:35:28Z|00567|binding|INFO|Setting lport eaf8bed9-e3ff-491c-adfa-61a91e2b0509 down in Southbound
Jan 23 09:35:28 compute-0 ovn_controller[94697]: 2026-01-23T09:35:28Z|00568|binding|INFO|Removing iface tapeaf8bed9-e3 ovn-installed in OVS
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.118 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.129 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:a6:68 10.100.0.12'], port_security=['fa:16:3e:e0:a6:68 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '77396169-1daa-41c7-8ba6-c68b50815e2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da36a2e2bae4483caec82cba10014d48', 'neutron:revision_number': '8', 'neutron:security_group_ids': '58a2818d-8ca4-4e70-8576-1bc8674f2c75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10cde387-ecdb-487a-958c-8320067c4581, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=eaf8bed9-e3ff-491c-adfa-61a91e2b0509) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.129 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Deactivated successfully.
Jan 23 09:35:28 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Consumed 11.831s CPU time.
Jan 23 09:35:28 compute-0 systemd-machined[153562]: Machine qemu-71-instance-00000089 terminated.
Jan 23 09:35:28 compute-0 podman[227732]: 2026-01-23 09:35:28.197909744 +0000 UTC m=+0.029536300 container create 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:35:28 compute-0 systemd[1]: Started libpod-conmon-11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88.scope.
Jan 23 09:35:28 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:35:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58cc7044cb679753d1697f7cde4774244703b05cde6021ecd1342e6101dc382c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:35:28 compute-0 podman[227732]: 2026-01-23 09:35:28.253431787 +0000 UTC m=+0.085058353 container init 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:35:28 compute-0 podman[227732]: 2026-01-23 09:35:28.259261232 +0000 UTC m=+0.090887778 container start 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:35:28 compute-0 podman[227732]: 2026-01-23 09:35:28.184282421 +0000 UTC m=+0.015908997 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [NOTICE]   (227748) : New worker (227750) forked
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [NOTICE]   (227748) : Loading success.
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.298 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.302 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.314 103978 INFO neutron.agent.ovn.metadata.agent [-] Port eaf8bed9-e3ff-491c-adfa-61a91e2b0509 in datapath 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 unbound from our chassis
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.315 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.316 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d4745090-195b-4786-81b4-a04faf941c29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.317 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 namespace which is not needed anymore
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.330 182096 INFO nova.virt.libvirt.driver [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Instance destroyed successfully.
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.332 182096 DEBUG nova.objects.instance [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lazy-loading 'resources' on Instance uuid 77396169-1daa-41c7-8ba6-c68b50815e2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.342 182096 DEBUG nova.virt.libvirt.vif [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-361105592',display_name='tempest-ServerStableDeviceRescueTest-server-361105592',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-361105592',id=137,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:34:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da36a2e2bae4483caec82cba10014d48',ramdisk_id='',reservation_id='r-42fam2uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1355926257',owner_user_name='tempest-ServerStableDeviceRescueTest-1355926257-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:34:33Z,user_data=None,user_id='800ea9ca92114ca5bf7589f4500f4bec',uuid=77396169-1daa-41c7-8ba6-c68b50815e2a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.343 182096 DEBUG nova.network.os_vif_util [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converting VIF {"id": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "address": "fa:16:3e:e0:a6:68", "network": {"id": "1a0d6dfa-0d95-490d-ab54-4e7f98a34e91", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1264562620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da36a2e2bae4483caec82cba10014d48", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaf8bed9-e3", "ovs_interfaceid": "eaf8bed9-e3ff-491c-adfa-61a91e2b0509", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.343 182096 DEBUG nova.network.os_vif_util [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.344 182096 DEBUG os_vif [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.345 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.345 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeaf8bed9-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.348 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.349 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.351 182096 INFO os_vif [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:a6:68,bridge_name='br-int',has_traffic_filtering=True,id=eaf8bed9-e3ff-491c-adfa-61a91e2b0509,network=Network(1a0d6dfa-0d95-490d-ab54-4e7f98a34e91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaf8bed9-e3')
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.351 182096 INFO nova.virt.libvirt.driver [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Deleting instance files /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a_del
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.352 182096 INFO nova.virt.libvirt.driver [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Deletion of /var/lib/nova/instances/77396169-1daa-41c7-8ba6-c68b50815e2a_del complete
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [NOTICE]   (227182) : haproxy version is 2.8.14-c23fe91
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [NOTICE]   (227182) : path to executable is /usr/sbin/haproxy
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [ALERT]    (227182) : Current worker (227184) exited with code 143 (Terminated)
Jan 23 09:35:28 compute-0 neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91[227178]: [WARNING]  (227182) : All workers exited. Exiting... (0)
Jan 23 09:35:28 compute-0 systemd[1]: libpod-1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19.scope: Deactivated successfully.
Jan 23 09:35:28 compute-0 podman[227784]: 2026-01-23 09:35:28.417485515 +0000 UTC m=+0.033579717 container died 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.431 182096 INFO nova.compute.manager [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.432 182096 DEBUG oslo.service.loopingcall [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.433 182096 DEBUG nova.compute.manager [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.433 182096 DEBUG nova.network.neutron [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.435 182096 DEBUG nova.compute.manager [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.435 182096 DEBUG oslo_concurrency.lockutils [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.436 182096 DEBUG oslo_concurrency.lockutils [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.436 182096 DEBUG oslo_concurrency.lockutils [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.436 182096 DEBUG nova.compute.manager [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.436 182096 DEBUG nova.compute.manager [req-d1726a85-fb67-4653-a4a1-03d377e11065 req-ce66a0dd-b3ce-4a92-b475-7ac1ee58b77b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-unplugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:35:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-6efe530b6f340e3b2c4086766d63dd9b71c581a8596f7f03253dc5e7f469e74f-merged.mount: Deactivated successfully.
Jan 23 09:35:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19-userdata-shm.mount: Deactivated successfully.
Jan 23 09:35:28 compute-0 podman[227784]: 2026-01-23 09:35:28.440805531 +0000 UTC m=+0.056899731 container cleanup 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.445 182096 DEBUG nova.network.neutron [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updated VIF entry in instance network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.446 182096 DEBUG nova.network.neutron [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:28 compute-0 systemd[1]: libpod-conmon-1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19.scope: Deactivated successfully.
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.459 182096 DEBUG oslo_concurrency.lockutils [req-f7e7ee67-405e-439e-8090-7c42e0bf9530 req-75e4358b-0c56-4c67-b7cf-414eb9fe6d32 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:35:28 compute-0 podman[227807]: 2026-01-23 09:35:28.48453466 +0000 UTC m=+0.024315596 container remove 1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.488 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1c4cda-fb41-46b9-a328-36488c83ce4e]: (4, ('Fri Jan 23 09:35:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19)\n1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19\nFri Jan 23 09:35:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 (1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19)\n1c67e749401fcb7e95a877680e41fad0e9e4956a00ed16657305b7b7e6541d19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.489 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[55578a6a-f07b-43ee-810a-25225770eb59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.490 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a0d6dfa-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.491 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 kernel: tap1a0d6dfa-00: left promiscuous mode
Jan 23 09:35:28 compute-0 nova_compute[182092]: 2026-01-23 09:35:28.504 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.507 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[175d7937-86a1-412f-81ac-936fedee725f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.518 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2357ef86-8b47-4b2e-b548-69d2177ca389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.519 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8157cd1f-0dc9-45b3-8478-71cb658d9e24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.530 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[acf6fe45-413f-45cb-bbe0-75ff1fd6cb78]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437829, 'reachable_time': 42833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227820, 'error': None, 'target': 'ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d1a0d6dfa\x2d0d95\x2d490d\x2dab54\x2d4e7f98a34e91.mount: Deactivated successfully.
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.532 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a0d6dfa-0d95-490d-ab54-4e7f98a34e91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:35:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:28.532 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9ad62c-043b-48c0-82f7-64b940ac5fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.004 182096 DEBUG nova.compute.manager [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.005 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.005 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.005 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.006 182096 DEBUG nova.compute.manager [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Processing event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.006 182096 DEBUG nova.compute.manager [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.006 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.006 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.007 182096 DEBUG oslo_concurrency.lockutils [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.007 182096 DEBUG nova.compute.manager [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] No waiting events found dispatching network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.007 182096 WARNING nova.compute.manager [req-bebd28c9-1465-4457-8463-3421cee7a326 req-030db178-6b80-4d1f-9efb-88e155b50364 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received unexpected event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc for instance with vm_state building and task_state spawning.
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.721 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160914.721176, 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.722 182096 INFO nova.compute.manager [-] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] VM Stopped (Lifecycle Event)
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.750 182096 DEBUG nova.compute.manager [None req-2f85c52b-a60f-4560-95d6-b5ca373e824c - - - - - -] [instance: 5ded6067-8713-4f6f-96bd-1a43e2d8b0e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.779 182096 DEBUG nova.network.neutron [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.793 182096 INFO nova.compute.manager [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Took 1.36 seconds to deallocate network for instance.
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.803 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.803 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160929.8037899, 6c69a90c-edeb-4866-b2da-0b4acfeb2355 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.804 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] VM Started (Lifecycle Event)
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.812 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.814 182096 INFO nova.virt.libvirt.driver [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Instance spawned successfully.
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.814 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.829 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.832 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.835 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.835 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.835 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.836 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.836 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.836 182096 DEBUG nova.virt.libvirt.driver [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.888 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.889 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160929.803851, 6c69a90c-edeb-4866-b2da-0b4acfeb2355 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.889 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] VM Paused (Lifecycle Event)
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.918 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.922 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160929.8051405, 6c69a90c-edeb-4866-b2da-0b4acfeb2355 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.922 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] VM Resumed (Lifecycle Event)
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.925 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.925 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.946 182096 INFO nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Took 9.50 seconds to spawn the instance on the hypervisor.
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.947 182096 DEBUG nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.948 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.953 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:35:29 compute-0 nova_compute[182092]: 2026-01-23 09:35:29.981 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.016 182096 DEBUG nova.compute.provider_tree [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.029 182096 DEBUG nova.scheduler.client.report [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.032 182096 INFO nova.compute.manager [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Took 10.04 seconds to build instance.
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.048 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.052 182096 DEBUG oslo_concurrency.lockutils [None req-821867ec-1363-4517-9082-ae4598f8b14a a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.073 182096 INFO nova.scheduler.client.report [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Deleted allocations for instance 77396169-1daa-41c7-8ba6-c68b50815e2a
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.132 182096 DEBUG oslo_concurrency.lockutils [None req-4a8f20f3-2ec1-418b-a421-0332d9264950 800ea9ca92114ca5bf7589f4500f4bec da36a2e2bae4483caec82cba10014d48 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.524 182096 DEBUG nova.compute.manager [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.524 182096 DEBUG oslo_concurrency.lockutils [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.524 182096 DEBUG oslo_concurrency.lockutils [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.525 182096 DEBUG oslo_concurrency.lockutils [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "77396169-1daa-41c7-8ba6-c68b50815e2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.525 182096 DEBUG nova.compute.manager [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] No waiting events found dispatching network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:30 compute-0 nova_compute[182092]: 2026-01-23 09:35:30.525 182096 WARNING nova.compute.manager [req-31a34064-5356-4492-82bb-a78bf8c50c24 req-d1ae98ec-c676-4e6e-849e-d731fee15313 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received unexpected event network-vif-plugged-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 for instance with vm_state deleted and task_state None.
Jan 23 09:35:31 compute-0 nova_compute[182092]: 2026-01-23 09:35:31.102 182096 DEBUG nova.compute.manager [req-16ac5821-1847-4b55-9bcb-44750e9df18b req-29a4844a-883c-4dfc-9d68-8d479a36b19f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Received event network-vif-deleted-eaf8bed9-e3ff-491c-adfa-61a91e2b0509 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:31 compute-0 podman[227829]: 2026-01-23 09:35:31.204638875 +0000 UTC m=+0.040901697 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Jan 23 09:35:32 compute-0 nova_compute[182092]: 2026-01-23 09:35:32.419 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:33 compute-0 nova_compute[182092]: 2026-01-23 09:35:33.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:33 compute-0 NetworkManager[54920]: <info>  [1769160933.5502] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 23 09:35:33 compute-0 nova_compute[182092]: 2026-01-23 09:35:33.547 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:33 compute-0 NetworkManager[54920]: <info>  [1769160933.5511] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Jan 23 09:35:33 compute-0 nova_compute[182092]: 2026-01-23 09:35:33.636 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:33 compute-0 ovn_controller[94697]: 2026-01-23T09:35:33Z|00569|binding|INFO|Releasing lport a2356132-1424-44aa-a17c-91a644594cb4 from this chassis (sb_readonly=0)
Jan 23 09:35:33 compute-0 nova_compute[182092]: 2026-01-23 09:35:33.646 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:34 compute-0 nova_compute[182092]: 2026-01-23 09:35:34.112 182096 DEBUG nova.compute.manager [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:34 compute-0 nova_compute[182092]: 2026-01-23 09:35:34.113 182096 DEBUG nova.compute.manager [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing instance network info cache due to event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:35:34 compute-0 nova_compute[182092]: 2026-01-23 09:35:34.113 182096 DEBUG oslo_concurrency.lockutils [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:35:34 compute-0 nova_compute[182092]: 2026-01-23 09:35:34.113 182096 DEBUG oslo_concurrency.lockutils [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:35:34 compute-0 nova_compute[182092]: 2026-01-23 09:35:34.113 182096 DEBUG nova.network.neutron [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:35:35 compute-0 nova_compute[182092]: 2026-01-23 09:35:35.505 182096 DEBUG nova.network.neutron [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updated VIF entry in instance network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:35:35 compute-0 nova_compute[182092]: 2026-01-23 09:35:35.506 182096 DEBUG nova.network.neutron [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:35 compute-0 nova_compute[182092]: 2026-01-23 09:35:35.530 182096 DEBUG oslo_concurrency.lockutils [req-65ce43b4-9afc-494c-b4dd-1e5726f9aec8 req-d2b69e9f-de90-488b-913b-af5c01ef9dd7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:35:35 compute-0 nova_compute[182092]: 2026-01-23 09:35:35.797 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:37 compute-0 nova_compute[182092]: 2026-01-23 09:35:37.421 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:37 compute-0 ovn_controller[94697]: 2026-01-23T09:35:37Z|00570|binding|INFO|Releasing lport a2356132-1424-44aa-a17c-91a644594cb4 from this chassis (sb_readonly=0)
Jan 23 09:35:37 compute-0 nova_compute[182092]: 2026-01-23 09:35:37.606 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:38 compute-0 nova_compute[182092]: 2026-01-23 09:35:38.349 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:39 compute-0 podman[227848]: 2026-01-23 09:35:39.224843236 +0000 UTC m=+0.062553146 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 09:35:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:39.867 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:39.867 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:39.868 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:40 compute-0 ovn_controller[94697]: 2026-01-23T09:35:40Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:b1:cc 10.100.0.5
Jan 23 09:35:40 compute-0 ovn_controller[94697]: 2026-01-23T09:35:40Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:b1:cc 10.100.0.5
Jan 23 09:35:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:41.568 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:41 compute-0 nova_compute[182092]: 2026-01-23 09:35:41.568 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:41.569 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:35:42 compute-0 nova_compute[182092]: 2026-01-23 09:35:42.422 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:43 compute-0 nova_compute[182092]: 2026-01-23 09:35:43.328 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160928.3271272, 77396169-1daa-41c7-8ba6-c68b50815e2a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:35:43 compute-0 nova_compute[182092]: 2026-01-23 09:35:43.328 182096 INFO nova.compute.manager [-] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] VM Stopped (Lifecycle Event)
Jan 23 09:35:43 compute-0 nova_compute[182092]: 2026-01-23 09:35:43.351 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:43 compute-0 nova_compute[182092]: 2026-01-23 09:35:43.356 182096 DEBUG nova.compute.manager [None req-8b454dff-a307-4397-a6f9-0782026195a7 - - - - - -] [instance: 77396169-1daa-41c7-8ba6-c68b50815e2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:35:44 compute-0 nova_compute[182092]: 2026-01-23 09:35:44.521 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:45.572 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:47 compute-0 podman[227880]: 2026-01-23 09:35:47.20823388 +0000 UTC m=+0.042728664 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:35:47 compute-0 podman[227879]: 2026-01-23 09:35:47.21338242 +0000 UTC m=+0.051035109 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:35:47 compute-0 nova_compute[182092]: 2026-01-23 09:35:47.423 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:48 compute-0 nova_compute[182092]: 2026-01-23 09:35:48.352 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:52 compute-0 nova_compute[182092]: 2026-01-23 09:35:52.424 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:53 compute-0 nova_compute[182092]: 2026-01-23 09:35:53.354 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:53 compute-0 nova_compute[182092]: 2026-01-23 09:35:53.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.377 182096 DEBUG nova.compute.manager [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.377 182096 DEBUG nova.compute.manager [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing instance network info cache due to event network-changed-073e61a9-60a4-439f-a2f4-36ae5c072adc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.377 182096 DEBUG oslo_concurrency.lockutils [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.377 182096 DEBUG oslo_concurrency.lockutils [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.378 182096 DEBUG nova.network.neutron [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Refreshing network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.431 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.431 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.432 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.432 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.432 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.439 182096 INFO nova.compute.manager [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Terminating instance
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.444 182096 DEBUG nova.compute.manager [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:35:56 compute-0 kernel: tap073e61a9-60 (unregistering): left promiscuous mode
Jan 23 09:35:56 compute-0 NetworkManager[54920]: <info>  [1769160956.4707] device (tap073e61a9-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 ovn_controller[94697]: 2026-01-23T09:35:56Z|00571|binding|INFO|Releasing lport 073e61a9-60a4-439f-a2f4-36ae5c072adc from this chassis (sb_readonly=0)
Jan 23 09:35:56 compute-0 ovn_controller[94697]: 2026-01-23T09:35:56Z|00572|binding|INFO|Setting lport 073e61a9-60a4-439f-a2f4-36ae5c072adc down in Southbound
Jan 23 09:35:56 compute-0 ovn_controller[94697]: 2026-01-23T09:35:56Z|00573|binding|INFO|Removing iface tap073e61a9-60 ovn-installed in OVS
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.473 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.477 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:b1:cc 10.100.0.5'], port_security=['fa:16:3e:c9:b1:cc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6c69a90c-edeb-4866-b2da-0b4acfeb2355', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ee87b8f51c547bc8fbc47a9fc3839be', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4347c1ec-b03a-433a-b1a4-bfdc6f5a2670 d6921887-b47d-44d3-ae5c-aedf46e6ccbb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88512b09-a7ca-4424-b849-b1221a082f4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=073e61a9-60a4-439f-a2f4-36ae5c072adc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.478 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 073e61a9-60a4-439f-a2f4-36ae5c072adc in datapath b9132fe1-3e71-471b-b4bd-f2fa5155726e unbound from our chassis
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.480 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9132fe1-3e71-471b-b4bd-f2fa5155726e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.481 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f370d2a-496c-4bae-a4b5-9aceb44498e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.482 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e namespace which is not needed anymore
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.489 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Jan 23 09:35:56 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008e.scope: Consumed 13.319s CPU time.
Jan 23 09:35:56 compute-0 systemd-machined[153562]: Machine qemu-73-instance-0000008e terminated.
Jan 23 09:35:56 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [NOTICE]   (227748) : haproxy version is 2.8.14-c23fe91
Jan 23 09:35:56 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [NOTICE]   (227748) : path to executable is /usr/sbin/haproxy
Jan 23 09:35:56 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [WARNING]  (227748) : Exiting Master process...
Jan 23 09:35:56 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [ALERT]    (227748) : Current worker (227750) exited with code 143 (Terminated)
Jan 23 09:35:56 compute-0 neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e[227744]: [WARNING]  (227748) : All workers exited. Exiting... (0)
Jan 23 09:35:56 compute-0 systemd[1]: libpod-11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88.scope: Deactivated successfully.
Jan 23 09:35:56 compute-0 podman[227938]: 2026-01-23 09:35:56.580074066 +0000 UTC m=+0.033092078 container died 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 09:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88-userdata-shm.mount: Deactivated successfully.
Jan 23 09:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-58cc7044cb679753d1697f7cde4774244703b05cde6021ecd1342e6101dc382c-merged.mount: Deactivated successfully.
Jan 23 09:35:56 compute-0 podman[227938]: 2026-01-23 09:35:56.595206417 +0000 UTC m=+0.048224430 container cleanup 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:35:56 compute-0 systemd[1]: libpod-conmon-11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88.scope: Deactivated successfully.
Jan 23 09:35:56 compute-0 podman[227961]: 2026-01-23 09:35:56.63276058 +0000 UTC m=+0.023014850 container remove 11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.638 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dd411c5d-97f6-49c2-9c82-8c85ee69e54d]: (4, ('Fri Jan 23 09:35:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e (11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88)\n11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88\nFri Jan 23 09:35:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e (11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88)\n11a1535f84faad88ea5ebf97cec8460ba9b1192189bc86d6d5a2be4641f85a88\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.640 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4167dfd7-66ea-488f-830f-f4471bb362b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.640 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9132fe1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.642 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 kernel: tapb9132fe1-30: left promiscuous mode
Jan 23 09:35:56 compute-0 NetworkManager[54920]: <info>  [1769160956.6582] manager: (tap073e61a9-60): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.658 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.660 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2b663588-8224-4964-b309-b206e78451c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.670 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5418c2-f0fe-438f-8df3-6086f559d722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.671 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e85ec38e-ae34-4eac-897e-6cdf8ce898cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.683 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4b6bd5dc-1798-4a8a-9c30-bcb905ce0239]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443320, 'reachable_time': 31866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227986, 'error': None, 'target': 'ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 systemd[1]: run-netns-ovnmeta\x2db9132fe1\x2d3e71\x2d471b\x2db4bd\x2df2fa5155726e.mount: Deactivated successfully.
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.685 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9132fe1-3e71-471b-b4bd-f2fa5155726e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:35:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:35:56.685 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[1b86c901-0398-491b-a0c6-57f230e7d9dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.689 182096 INFO nova.virt.libvirt.driver [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Instance destroyed successfully.
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.689 182096 DEBUG nova.objects.instance [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lazy-loading 'resources' on Instance uuid 6c69a90c-edeb-4866-b2da-0b4acfeb2355 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.702 182096 DEBUG nova.virt.libvirt.vif [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-275687208-access_point-946796410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-275687208-acc',id=142,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDCRgH5f9fNAe+kWWWE1cX+DmnsGpTnYchAJ3P9bncspSdyO/BPBk+/eEXnS9GohvsEXpfFZL+Px5hyY0cktFgiS+i2z7DYQCRG1mHzRuamYVxyUS4ddWC35fBRR0CTOaQ==',key_name='tempest-TestSecurityGroupsBasicOps-851949540',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:35:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5ee87b8f51c547bc8fbc47a9fc3839be',ramdisk_id='',reservation_id='r-5nqyk5tz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-275687208',owner_user_name='tempest-TestSecurityGroupsBasicOps-275687208-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:35:29Z,user_data=None,user_id='a2ccb949ece647429b744fd0f25e41b4',uuid=6c69a90c-edeb-4866-b2da-0b4acfeb2355,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.703 182096 DEBUG nova.network.os_vif_util [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converting VIF {"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.703 182096 DEBUG nova.network.os_vif_util [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.704 182096 DEBUG os_vif [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.704 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.705 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap073e61a9-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.705 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.707 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.709 182096 INFO os_vif [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:b1:cc,bridge_name='br-int',has_traffic_filtering=True,id=073e61a9-60a4-439f-a2f4-36ae5c072adc,network=Network(b9132fe1-3e71-471b-b4bd-f2fa5155726e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap073e61a9-60')
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.710 182096 INFO nova.virt.libvirt.driver [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Deleting instance files /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355_del
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.710 182096 INFO nova.virt.libvirt.driver [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Deletion of /var/lib/nova/instances/6c69a90c-edeb-4866-b2da-0b4acfeb2355_del complete
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.786 182096 INFO nova.compute.manager [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.786 182096 DEBUG oslo.service.loopingcall [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.787 182096 DEBUG nova.compute.manager [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:35:56 compute-0 nova_compute[182092]: 2026-01-23 09:35:56.787 182096 DEBUG nova.network.neutron [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.208 182096 DEBUG nova.compute.manager [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-unplugged-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.208 182096 DEBUG oslo_concurrency.lockutils [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.208 182096 DEBUG oslo_concurrency.lockutils [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.208 182096 DEBUG oslo_concurrency.lockutils [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.209 182096 DEBUG nova.compute.manager [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] No waiting events found dispatching network-vif-unplugged-073e61a9-60a4-439f-a2f4-36ae5c072adc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.209 182096 DEBUG nova.compute.manager [req-ea36e46b-5252-4dc1-b019-501eeb2b593f req-f798a8f7-4bda-4831-a410-031397e8fd99 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-unplugged-073e61a9-60a4-439f-a2f4-36ae5c072adc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.427 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.456 182096 DEBUG nova.network.neutron [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.468 182096 INFO nova.compute.manager [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Took 0.68 seconds to deallocate network for instance.
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.480 182096 DEBUG nova.network.neutron [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updated VIF entry in instance network info cache for port 073e61a9-60a4-439f-a2f4-36ae5c072adc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.481 182096 DEBUG nova.network.neutron [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [{"id": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "address": "fa:16:3e:c9:b1:cc", "network": {"id": "b9132fe1-3e71-471b-b4bd-f2fa5155726e", "bridge": "br-int", "label": "tempest-network-smoke--984737951", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ee87b8f51c547bc8fbc47a9fc3839be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap073e61a9-60", "ovs_interfaceid": "073e61a9-60a4-439f-a2f4-36ae5c072adc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.495 182096 DEBUG oslo_concurrency.lockutils [req-058204ee-3919-47d0-93ef-8b5f055132ac req-e40f1406-c1b2-4425-a43b-2d61786d2c1b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-6c69a90c-edeb-4866-b2da-0b4acfeb2355" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.502 182096 DEBUG nova.compute.manager [req-a075da02-708a-4202-a87f-1b737da8bef1 req-8537091f-bad6-44cf-bc0a-b69a3fee128a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-deleted-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.503 182096 INFO nova.compute.manager [req-a075da02-708a-4202-a87f-1b737da8bef1 req-8537091f-bad6-44cf-bc0a-b69a3fee128a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Neutron deleted interface 073e61a9-60a4-439f-a2f4-36ae5c072adc; detaching it from the instance and deleting it from the info cache
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.503 182096 DEBUG nova.network.neutron [req-a075da02-708a-4202-a87f-1b737da8bef1 req-8537091f-bad6-44cf-bc0a-b69a3fee128a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.522 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.522 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.525 182096 DEBUG nova.compute.manager [req-a075da02-708a-4202-a87f-1b737da8bef1 req-8537091f-bad6-44cf-bc0a-b69a3fee128a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Detach interface failed, port_id=073e61a9-60a4-439f-a2f4-36ae5c072adc, reason: Instance 6c69a90c-edeb-4866-b2da-0b4acfeb2355 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.563 182096 DEBUG nova.compute.provider_tree [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.572 182096 DEBUG nova.scheduler.client.report [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.583 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.603 182096 INFO nova.scheduler.client.report [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Deleted allocations for instance 6c69a90c-edeb-4866-b2da-0b4acfeb2355
Jan 23 09:35:57 compute-0 nova_compute[182092]: 2026-01-23 09:35:57.664 182096 DEBUG oslo_concurrency.lockutils [None req-3e47bb50-80c8-4fc7-9cd9-bbfc8f7b8d54 a2ccb949ece647429b744fd0f25e41b4 5ee87b8f51c547bc8fbc47a9fc3839be - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:58 compute-0 podman[227992]: 2026-01-23 09:35:58.205353044 +0000 UTC m=+0.039567111 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 09:35:58 compute-0 podman[227993]: 2026-01-23 09:35:58.209336196 +0000 UTC m=+0.042068517 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.439 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.439 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.456 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.523 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.523 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.528 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.528 182096 INFO nova.compute.claims [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.627 182096 DEBUG nova.compute.provider_tree [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.637 182096 DEBUG nova.scheduler.client.report [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.654 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.655 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.686 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.687 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.697 182096 INFO nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.708 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.798 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.799 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.799 182096 INFO nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Creating image(s)
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.800 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.800 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.800 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.810 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.858 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.859 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.860 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.869 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.914 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.915 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.938 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.939 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.939 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.986 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.986 182096 DEBUG nova.virt.disk.api [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Checking if we can resize image /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:35:58 compute-0 nova_compute[182092]: 2026-01-23 09:35:58.987 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.034 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.035 182096 DEBUG nova.virt.disk.api [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Cannot resize image /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.035 182096 DEBUG nova.objects.instance [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'migration_context' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.048 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.048 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Ensure instance console log exists: /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.049 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.049 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.049 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.289 182096 DEBUG nova.compute.manager [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.289 182096 DEBUG oslo_concurrency.lockutils [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.290 182096 DEBUG oslo_concurrency.lockutils [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.290 182096 DEBUG oslo_concurrency.lockutils [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "6c69a90c-edeb-4866-b2da-0b4acfeb2355-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.290 182096 DEBUG nova.compute.manager [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] No waiting events found dispatching network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.290 182096 WARNING nova.compute.manager [req-f11a5731-b915-402e-99b0-37eaf7d208e2 req-24229b0f-28df-4a52-bfd7-f7c51a9b4df8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Received unexpected event network-vif-plugged-073e61a9-60a4-439f-a2f4-36ae5c072adc for instance with vm_state deleted and task_state None.
Jan 23 09:35:59 compute-0 nova_compute[182092]: 2026-01-23 09:35:59.470 182096 DEBUG nova.policy [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.014 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Successfully created port: 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.634 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Successfully updated port: 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.644 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.644 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquired lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.645 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:36:00 compute-0 nova_compute[182092]: 2026-01-23 09:36:00.791 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.384 182096 DEBUG nova.compute.manager [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-changed-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.384 182096 DEBUG nova.compute.manager [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Refreshing instance network info cache due to event network-changed-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.385 182096 DEBUG oslo_concurrency.lockutils [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.470 182096 DEBUG nova.network.neutron [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updating instance_info_cache with network_info: [{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.485 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Releasing lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.486 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance network_info: |[{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.486 182096 DEBUG oslo_concurrency.lockutils [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.486 182096 DEBUG nova.network.neutron [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Refreshing network info cache for port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.488 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start _get_guest_xml network_info=[{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.492 182096 WARNING nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.496 182096 DEBUG nova.virt.libvirt.host [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.496 182096 DEBUG nova.virt.libvirt.host [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.498 182096 DEBUG nova.virt.libvirt.host [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.499 182096 DEBUG nova.virt.libvirt.host [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.500 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.500 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.500 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.501 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.501 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.501 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.501 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.502 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.502 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.502 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.503 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.503 182096 DEBUG nova.virt.hardware [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.506 182096 DEBUG nova.virt.libvirt.vif [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:35:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-185163690',display_name='tempest-ServerRescueNegativeTestJSON-server-185163690',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-185163690',id=145,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7101771f6abd46148c36f80fed40c40e',ramdisk_id='',reservation_id='r-4xmktry0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-776163304',owner_user_name='tempest-ServerRescueNegativeTestJSON-776163304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:35:58Z,user_data=None,user_id='35902c3964864ac9b7e446cb9f746d80',uuid=614c42dd-39f8-42e7-b9c5-b0d69d8e17c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.506 182096 DEBUG nova.network.os_vif_util [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converting VIF {"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.507 182096 DEBUG nova.network.os_vif_util [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.508 182096 DEBUG nova.objects.instance [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'pci_devices' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.522 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <uuid>614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</uuid>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <name>instance-00000091</name>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-185163690</nova:name>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:36:01</nova:creationTime>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:user uuid="35902c3964864ac9b7e446cb9f746d80">tempest-ServerRescueNegativeTestJSON-776163304-project-member</nova:user>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:project uuid="7101771f6abd46148c36f80fed40c40e">tempest-ServerRescueNegativeTestJSON-776163304</nova:project>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         <nova:port uuid="18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc">
Jan 23 09:36:01 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <system>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="serial">614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="uuid">614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </system>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <os>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </os>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <features>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </features>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:80:6a:08"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <target dev="tap18e2ec6e-de"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/console.log" append="off"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <video>
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </video>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:36:01 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:36:01 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:36:01 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:36:01 compute-0 nova_compute[182092]: </domain>
Jan 23 09:36:01 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.523 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Preparing to wait for external event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.524 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.524 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.524 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.525 182096 DEBUG nova.virt.libvirt.vif [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:35:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-185163690',display_name='tempest-ServerRescueNegativeTestJSON-server-185163690',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-185163690',id=145,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7101771f6abd46148c36f80fed40c40e',ramdisk_id='',reservation_id='r-4xmktry0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-776163304',owner_user_name='tempest-ServerRescueNegativeTestJSON-776163304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:35:58Z,user_data=None,user_id='35902c3964864ac9b7e446cb9f746d80',uuid=614c42dd-39f8-42e7-b9c5-b0d69d8e17c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.525 182096 DEBUG nova.network.os_vif_util [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converting VIF {"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.526 182096 DEBUG nova.network.os_vif_util [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.526 182096 DEBUG os_vif [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.527 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.527 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.529 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.529 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18e2ec6e-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.530 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18e2ec6e-de, col_values=(('external_ids', {'iface-id': '18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:6a:08', 'vm-uuid': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.531 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:01 compute-0 NetworkManager[54920]: <info>  [1769160961.5324] manager: (tap18e2ec6e-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.535 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.536 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.536 182096 INFO os_vif [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de')
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.571 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.572 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.572 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No VIF found with MAC fa:16:3e:80:6a:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:36:01 compute-0 nova_compute[182092]: 2026-01-23 09:36:01.573 182096 INFO nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Using config drive
Jan 23 09:36:01 compute-0 podman[228050]: 2026-01-23 09:36:01.611326886 +0000 UTC m=+0.047728092 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter)
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.428 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.741 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.787 182096 INFO nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Creating config drive at /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.791 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfcy304y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.909 182096 DEBUG oslo_concurrency.processutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfcy304y" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:02 compute-0 NetworkManager[54920]: <info>  [1769160962.9463] manager: (tap18e2ec6e-de): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 23 09:36:02 compute-0 kernel: tap18e2ec6e-de: entered promiscuous mode
Jan 23 09:36:02 compute-0 ovn_controller[94697]: 2026-01-23T09:36:02Z|00574|binding|INFO|Claiming lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for this chassis.
Jan 23 09:36:02 compute-0 ovn_controller[94697]: 2026-01-23T09:36:02Z|00575|binding|INFO|18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc: Claiming fa:16:3e:80:6a:08 10.100.0.14
Jan 23 09:36:02 compute-0 nova_compute[182092]: 2026-01-23 09:36:02.950 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.957 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:6a:08 10.100.0.14'], port_security=['fa:16:3e:80:6a:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4643f642-fdbf-416e-8a97-b503a2990568', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7101771f6abd46148c36f80fed40c40e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92222b96-d0dc-43a0-9343-875648372333', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f71413d6-fb1e-4318-a143-8bcb0cb14270, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.958 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc in datapath 4643f642-fdbf-416e-8a97-b503a2990568 bound to our chassis
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.959 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.967 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b025c263-9108-42d2-a508-aeb579e12da7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.967 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4643f642-f1 in ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.969 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4643f642-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.969 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c84f449-1469-4129-9269-37d7a243112e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.969 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7c23a124-eda5-4896-99a6-a9e56149bd67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:02 compute-0 systemd-udevd[228086]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:36:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:02.983 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1e69f7-297d-45ef-b01f-ce1aa975eb29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:02 compute-0 NetworkManager[54920]: <info>  [1769160962.9898] device (tap18e2ec6e-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:36:02 compute-0 NetworkManager[54920]: <info>  [1769160962.9911] device (tap18e2ec6e-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:36:03 compute-0 systemd-machined[153562]: New machine qemu-74-instance-00000091.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.008 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.006 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6ca7dc-7f2d-4b72-8edd-cb2f2451973b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_controller[94697]: 2026-01-23T09:36:03Z|00576|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc ovn-installed in OVS
Jan 23 09:36:03 compute-0 ovn_controller[94697]: 2026-01-23T09:36:03Z|00577|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc up in Southbound
Jan 23 09:36:03 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-00000091.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.013 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.033 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[42a74700-800e-40c7-a834-71aa601dbd06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 systemd-udevd[228091]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:36:03 compute-0 NetworkManager[54920]: <info>  [1769160963.0379] manager: (tap4643f642-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.037 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[15086904-b918-4dc1-b8a3-d1cb5a394440]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.061 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[67fd7b51-5049-4908-9c4f-d8777cfbc88f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.064 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[aa101865-4128-407c-bd42-957c4462d868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 NetworkManager[54920]: <info>  [1769160963.0808] device (tap4643f642-f0): carrier: link connected
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.086 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[a843a611-7448-42c4-b1d9-f2e31bd485ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.099 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd22a75d-0f2c-4d7f-9865-7555538fe1cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4643f642-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:96:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446856, 'reachable_time': 41648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228112, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.110 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6d99195d-bead-4724-86bd-20c2acd5db05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:9627'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446856, 'tstamp': 446856}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228113, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.122 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd69ef1c-88ae-4279-90cf-1aaf7e1f1453]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4643f642-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:96:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446856, 'reachable_time': 41648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228114, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.148 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b8184dc0-a565-4aed-b981-b254bbdbd39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.189 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0a07ad-b1ac-413f-b809-e2648c2ffbdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.190 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4643f642-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.190 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.190 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4643f642-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.191 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 NetworkManager[54920]: <info>  [1769160963.1923] manager: (tap4643f642-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 23 09:36:03 compute-0 kernel: tap4643f642-f0: entered promiscuous mode
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.193 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.198 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4643f642-f0, col_values=(('external_ids', {'iface-id': 'b964e497-de7e-4294-b14d-3cf88cb4cf4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.199 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 ovn_controller[94697]: 2026-01-23T09:36:03Z|00578|binding|INFO|Releasing lport b964e497-de7e-4294-b14d-3cf88cb4cf4d from this chassis (sb_readonly=0)
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.200 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.211 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.211 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.212 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[56e370f0-1c41-44b8-b1ff-7159b8eb650f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.213 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:36:03 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:03.213 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'env', 'PROCESS_TAG=haproxy-4643f642-fdbf-416e-8a97-b503a2990568', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4643f642-fdbf-416e-8a97-b503a2990568.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.470 182096 DEBUG nova.compute.manager [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.471 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.472 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.472 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.472 182096 DEBUG nova.compute.manager [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Processing event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.473 182096 DEBUG nova.compute.manager [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.473 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.473 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.473 182096 DEBUG oslo_concurrency.lockutils [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.474 182096 DEBUG nova.compute.manager [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.474 182096 WARNING nova.compute.manager [req-fb3679ea-8203-45ea-a108-0a95f1863758 req-5771aa05-0cc2-4ebc-ab94-8fc648d97eed 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state building and task_state spawning.
Jan 23 09:36:03 compute-0 podman[228142]: 2026-01-23 09:36:03.491868657 +0000 UTC m=+0.038136392 container create 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:36:03 compute-0 systemd[1]: Started libpod-conmon-6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845.scope.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.515 182096 DEBUG nova.network.neutron [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updated VIF entry in instance network info cache for port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.516 182096 DEBUG nova.network.neutron [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updating instance_info_cache with network_info: [{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.533 182096 DEBUG oslo_concurrency.lockutils [req-849f7073-88c2-439c-99ba-3d84a62adcc7 req-f23f88d9-8ed7-4814-9f66-2961d4f70259 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:36:03 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:36:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29c2f42ad5abaceaaf93713afd100fb01757cee0c68197a1ed4b24eb00ace8c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:36:03 compute-0 podman[228142]: 2026-01-23 09:36:03.544413172 +0000 UTC m=+0.090680917 container init 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 09:36:03 compute-0 podman[228142]: 2026-01-23 09:36:03.549024952 +0000 UTC m=+0.095292687 container start 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:36:03 compute-0 podman[228142]: 2026-01-23 09:36:03.476888873 +0000 UTC m=+0.023156627 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:36:03 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [NOTICE]   (228158) : New worker (228160) forked
Jan 23 09:36:03 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [NOTICE]   (228158) : Loading success.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.721 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.723 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160963.721617, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.723 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Started (Lifecycle Event)
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.730 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.733 182096 INFO nova.virt.libvirt.driver [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance spawned successfully.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.734 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.748 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.752 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.754 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.755 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.755 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.755 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.756 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.756 182096 DEBUG nova.virt.libvirt.driver [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.774 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.775 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160963.7222052, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.775 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Paused (Lifecycle Event)
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.790 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.792 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160963.7260613, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.792 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Resumed (Lifecycle Event)
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.802 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.803 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.815 182096 INFO nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Took 5.02 seconds to spawn the instance on the hypervisor.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.816 182096 DEBUG nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.823 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.978 182096 INFO nova.compute.manager [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Took 5.47 seconds to build instance.
Jan 23 09:36:03 compute-0 nova_compute[182092]: 2026-01-23 09:36:03.998 182096 DEBUG oslo_concurrency.lockutils [None req-36215542-10ad-46dc-8516-084505ec126b 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:05 compute-0 nova_compute[182092]: 2026-01-23 09:36:05.818 182096 INFO nova.compute.manager [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Rescuing
Jan 23 09:36:05 compute-0 nova_compute[182092]: 2026-01-23 09:36:05.819 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:05 compute-0 nova_compute[182092]: 2026-01-23 09:36:05.819 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquired lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:36:05 compute-0 nova_compute[182092]: 2026-01-23 09:36:05.819 182096 DEBUG nova.network.neutron [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:36:06 compute-0 nova_compute[182092]: 2026-01-23 09:36:06.532 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:07 compute-0 nova_compute[182092]: 2026-01-23 09:36:07.431 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:07 compute-0 nova_compute[182092]: 2026-01-23 09:36:07.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:09 compute-0 nova_compute[182092]: 2026-01-23 09:36:09.471 182096 DEBUG nova.network.neutron [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updating instance_info_cache with network_info: [{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:36:09 compute-0 nova_compute[182092]: 2026-01-23 09:36:09.489 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Releasing lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:36:09 compute-0 nova_compute[182092]: 2026-01-23 09:36:09.695 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 09:36:10 compute-0 podman[228172]: 2026-01-23 09:36:10.224176941 +0000 UTC m=+0.061138888 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:36:10 compute-0 nova_compute[182092]: 2026-01-23 09:36:10.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.675 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.676 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.688 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160956.6875503, 6c69a90c-edeb-4866-b2da-0b4acfeb2355 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.688 182096 INFO nova.compute.manager [-] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] VM Stopped (Lifecycle Event)
Jan 23 09:36:11 compute-0 nova_compute[182092]: 2026-01-23 09:36:11.704 182096 DEBUG nova.compute.manager [None req-4b79844b-efaa-4eb5-980c-30a5e46baf75 - - - - - -] [instance: 6c69a90c-edeb-4866-b2da-0b4acfeb2355] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:12 compute-0 nova_compute[182092]: 2026-01-23 09:36:12.433 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.769 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updating instance_info_cache with network_info: [{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.785 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.785 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.785 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.785 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.786 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.849 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.850 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.850 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.850 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.898 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.958 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:13 compute-0 nova_compute[182092]: 2026-01-23 09:36:13.959 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.006 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:14 compute-0 ovn_controller[94697]: 2026-01-23T09:36:14Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:6a:08 10.100.0.14
Jan 23 09:36:14 compute-0 ovn_controller[94697]: 2026-01-23T09:36:14Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:6a:08 10.100.0.14
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.229 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.230 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=73.20173263549805GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.231 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.231 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.288 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.289 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.289 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.350 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.374 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.397 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:36:14 compute-0 nova_compute[182092]: 2026-01-23 09:36:14.398 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:15 compute-0 nova_compute[182092]: 2026-01-23 09:36:15.394 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:16 compute-0 nova_compute[182092]: 2026-01-23 09:36:16.536 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:16 compute-0 nova_compute[182092]: 2026-01-23 09:36:16.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:16 compute-0 nova_compute[182092]: 2026-01-23 09:36:16.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:16 compute-0 nova_compute[182092]: 2026-01-23 09:36:16.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:36:17 compute-0 nova_compute[182092]: 2026-01-23 09:36:17.435 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:18 compute-0 podman[228219]: 2026-01-23 09:36:18.202197236 +0000 UTC m=+0.036080675 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:36:18 compute-0 podman[228218]: 2026-01-23 09:36:18.236223254 +0000 UTC m=+0.071942454 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 23 09:36:19 compute-0 nova_compute[182092]: 2026-01-23 09:36:19.726 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 09:36:20 compute-0 nova_compute[182092]: 2026-01-23 09:36:20.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:36:21 compute-0 nova_compute[182092]: 2026-01-23 09:36:21.539 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:21 compute-0 kernel: tap18e2ec6e-de (unregistering): left promiscuous mode
Jan 23 09:36:21 compute-0 NetworkManager[54920]: <info>  [1769160981.8415] device (tap18e2ec6e-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:36:21 compute-0 ovn_controller[94697]: 2026-01-23T09:36:21Z|00579|binding|INFO|Releasing lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc from this chassis (sb_readonly=0)
Jan 23 09:36:21 compute-0 ovn_controller[94697]: 2026-01-23T09:36:21Z|00580|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc down in Southbound
Jan 23 09:36:21 compute-0 ovn_controller[94697]: 2026-01-23T09:36:21Z|00581|binding|INFO|Removing iface tap18e2ec6e-de ovn-installed in OVS
Jan 23 09:36:21 compute-0 nova_compute[182092]: 2026-01-23 09:36:21.847 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:21.852 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:6a:08 10.100.0.14'], port_security=['fa:16:3e:80:6a:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4643f642-fdbf-416e-8a97-b503a2990568', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7101771f6abd46148c36f80fed40c40e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92222b96-d0dc-43a0-9343-875648372333', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f71413d6-fb1e-4318-a143-8bcb0cb14270, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:36:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:21.853 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc in datapath 4643f642-fdbf-416e-8a97-b503a2990568 unbound from our chassis
Jan 23 09:36:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:21.854 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4643f642-fdbf-416e-8a97-b503a2990568, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:36:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:21.860 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1a8837c7-48bd-4d1e-a775-822791567b8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:21 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:21.860 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 namespace which is not needed anymore
Jan 23 09:36:21 compute-0 nova_compute[182092]: 2026-01-23 09:36:21.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:21 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 23 09:36:21 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Consumed 11.272s CPU time.
Jan 23 09:36:21 compute-0 systemd-machined[153562]: Machine qemu-74-instance-00000091 terminated.
Jan 23 09:36:21 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [NOTICE]   (228158) : haproxy version is 2.8.14-c23fe91
Jan 23 09:36:21 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [NOTICE]   (228158) : path to executable is /usr/sbin/haproxy
Jan 23 09:36:21 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [WARNING]  (228158) : Exiting Master process...
Jan 23 09:36:21 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [ALERT]    (228158) : Current worker (228160) exited with code 143 (Terminated)
Jan 23 09:36:21 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228154]: [WARNING]  (228158) : All workers exited. Exiting... (0)
Jan 23 09:36:21 compute-0 systemd[1]: libpod-6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845.scope: Deactivated successfully.
Jan 23 09:36:21 compute-0 podman[228283]: 2026-01-23 09:36:21.958183102 +0000 UTC m=+0.033827665 container died 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:36:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-29c2f42ad5abaceaaf93713afd100fb01757cee0c68197a1ed4b24eb00ace8c6-merged.mount: Deactivated successfully.
Jan 23 09:36:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845-userdata-shm.mount: Deactivated successfully.
Jan 23 09:36:21 compute-0 podman[228283]: 2026-01-23 09:36:21.977385991 +0000 UTC m=+0.053030554 container cleanup 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:36:21 compute-0 systemd[1]: libpod-conmon-6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845.scope: Deactivated successfully.
Jan 23 09:36:22 compute-0 podman[228306]: 2026-01-23 09:36:22.0191905 +0000 UTC m=+0.026134274 container remove 6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.022 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6daecc-104e-406e-b482-9add31c345fa]: (4, ('Fri Jan 23 09:36:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 (6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845)\n6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845\nFri Jan 23 09:36:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 (6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845)\n6eb380145452ee207351c81d8f5da04543180a913f2f89933e9874b8bc318845\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.024 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0f3e6a-8525-43cb-b3f0-e949688ece1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.025 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4643f642-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.026 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.038 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 kernel: tap4643f642-f0: left promiscuous mode
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.043 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.045 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2b41e199-a3be-4908-9c73-998ed8404d3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.053 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c2717409-53b4-4d6c-ba86-e3c6edeeae6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.054 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d9fe2184-9e17-4d4c-8b3d-036e2987fa75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.062 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.068 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.069 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b5976c21-99e1-42c1-8b69-47b382eb23d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446851, 'reachable_time': 16723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228323, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.071 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:36:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:22.071 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[5726ca76-5736-4a3d-8f34-7753f8170a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:22 compute-0 systemd[1]: run-netns-ovnmeta\x2d4643f642\x2dfdbf\x2d416e\x2d8a97\x2db503a2990568.mount: Deactivated successfully.
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.437 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.736 182096 INFO nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance shutdown successfully after 13 seconds.
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.740 182096 INFO nova.virt.libvirt.driver [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance destroyed successfully.
Jan 23 09:36:22 compute-0 nova_compute[182092]: 2026-01-23 09:36:22.740 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'numa_topology' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.008 182096 DEBUG nova.compute.manager [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.008 182096 DEBUG oslo_concurrency.lockutils [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.009 182096 DEBUG oslo_concurrency.lockutils [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.009 182096 DEBUG oslo_concurrency.lockutils [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.009 182096 DEBUG nova.compute.manager [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.009 182096 WARNING nova.compute.manager [req-d5712c3b-f2d2-44cf-8cac-acfd761cbbb0 req-6c52ebe4-bc72-4bda-92cd-1b8a07225424 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state active and task_state rescuing.
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.043 182096 INFO nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Attempting rescue
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.043 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.047 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.047 182096 INFO nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Creating image(s)
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.047 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.048 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.048 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.048 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.102 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.103 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.112 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.159 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.160 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.179 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.rescue" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.180 182096 DEBUG oslo_concurrency.lockutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.180 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'migration_context' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.194 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.194 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start _get_guest_xml network_info=[{"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "vif_mac": "fa:16:3e:80:6a:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.195 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'resources' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.301 182096 WARNING nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.305 182096 DEBUG nova.virt.libvirt.host [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.305 182096 DEBUG nova.virt.libvirt.host [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.308 182096 DEBUG nova.virt.libvirt.host [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.308 182096 DEBUG nova.virt.libvirt.host [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.309 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.309 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.309 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.309 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.310 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.310 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.310 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.310 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.310 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.311 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.311 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.311 182096 DEBUG nova.virt.hardware [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.311 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.323 182096 DEBUG nova.virt.libvirt.vif [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:35:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-185163690',display_name='tempest-ServerRescueNegativeTestJSON-server-185163690',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-185163690',id=145,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:36:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7101771f6abd46148c36f80fed40c40e',ramdisk_id='',reservation_id='r-4xmktry0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-776163304',owner_user_name='tempest-ServerRescueNegativeTestJSON-776163304-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:03Z,user_data=None,user_id='35902c3964864ac9b7e446cb9f746d80',uuid=614c42dd-39f8-42e7-b9c5-b0d69d8e17c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "vif_mac": "fa:16:3e:80:6a:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.324 182096 DEBUG nova.network.os_vif_util [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converting VIF {"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "vif_mac": "fa:16:3e:80:6a:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.324 182096 DEBUG nova.network.os_vif_util [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.325 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'pci_devices' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.336 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <uuid>614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</uuid>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <name>instance-00000091</name>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-185163690</nova:name>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:36:24</nova:creationTime>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:user uuid="35902c3964864ac9b7e446cb9f746d80">tempest-ServerRescueNegativeTestJSON-776163304-project-member</nova:user>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:project uuid="7101771f6abd46148c36f80fed40c40e">tempest-ServerRescueNegativeTestJSON-776163304</nova:project>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         <nova:port uuid="18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc">
Jan 23 09:36:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <system>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="serial">614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="uuid">614c42dd-39f8-42e7-b9c5-b0d69d8e17c0</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </system>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <os>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </os>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <features>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </features>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.rescue"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <target dev="vdb" bus="virtio"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config.rescue"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:80:6a:08"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <target dev="tap18e2ec6e-de"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/console.log" append="off"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <video>
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </video>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:36:24 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:36:24 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:36:24 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:36:24 compute-0 nova_compute[182092]: </domain>
Jan 23 09:36:24 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.341 182096 INFO nova.virt.libvirt.driver [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance destroyed successfully.
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.386 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.386 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.386 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.386 182096 DEBUG nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] No VIF found with MAC fa:16:3e:80:6a:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.387 182096 INFO nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Using config drive
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.402 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.436 182096 DEBUG nova.objects.instance [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'keypairs' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.835 182096 INFO nova.virt.libvirt.driver [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Creating config drive at /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config.rescue
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.839 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkfif1qel execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:24.848 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:36:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:24.849 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:24 compute-0 nova_compute[182092]: 2026-01-23 09:36:24.959 182096 DEBUG oslo_concurrency.processutils [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkfif1qel" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:25 compute-0 kernel: tap18e2ec6e-de: entered promiscuous mode
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.0022] manager: (tap18e2ec6e-de): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 23 09:36:25 compute-0 ovn_controller[94697]: 2026-01-23T09:36:25Z|00582|binding|INFO|Claiming lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for this chassis.
Jan 23 09:36:25 compute-0 ovn_controller[94697]: 2026-01-23T09:36:25Z|00583|binding|INFO|18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc: Claiming fa:16:3e:80:6a:08 10.100.0.14
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.006 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.013 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:6a:08 10.100.0.14'], port_security=['fa:16:3e:80:6a:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4643f642-fdbf-416e-8a97-b503a2990568', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7101771f6abd46148c36f80fed40c40e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '92222b96-d0dc-43a0-9343-875648372333', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f71413d6-fb1e-4318-a143-8bcb0cb14270, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.014 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc in datapath 4643f642-fdbf-416e-8a97-b503a2990568 bound to our chassis
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.015 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:25 compute-0 ovn_controller[94697]: 2026-01-23T09:36:25Z|00584|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc ovn-installed in OVS
Jan 23 09:36:25 compute-0 ovn_controller[94697]: 2026-01-23T09:36:25Z|00585|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc up in Southbound
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.019 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.021 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.024 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.024 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[938333e1-51c2-4d82-914b-4dbe71b981ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.025 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4643f642-f1 in ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.027 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4643f642-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.027 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccc737c-2ebb-4b7b-9050-201a6984bc8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.029 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[37da3b8c-44d2-44b6-9881-d6b6f15ae4e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 systemd-udevd[228365]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.036 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7553709f-7dad-4fa7-8b49-84c5ba445f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.0400] device (tap18e2ec6e-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.0438] device (tap18e2ec6e-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:36:25 compute-0 systemd-machined[153562]: New machine qemu-75-instance-00000091.
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.047 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[83e605ef-675a-46a6-ab87-664238774611]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-00000091.
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.067 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2181a2f0-238c-4024-8826-28e8139b1c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.0727] manager: (tap4643f642-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 23 09:36:25 compute-0 systemd-udevd[228369]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.073 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[061f4843-162f-4cd6-92aa-ffa7df9e7ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.097 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[688477ae-8e9d-4966-af54-2128c63a98eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.099 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e945dd1f-df02-46f5-935f-98bbc09dee5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.1189] device (tap4643f642-f0): carrier: link connected
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.123 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e4c8d4-f9b8-4d66-9fb2-5109423037aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.135 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5d5393-4c25-41a1-bb3a-1957c330c7df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4643f642-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:96:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449060, 'reachable_time': 18608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228390, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.146 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b37c8f3b-6dfd-45fa-b3e3-6a565958a245]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:9627'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449060, 'tstamp': 449060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228391, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.159 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd4525e-7c41-429b-b065-d7515d14c3c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4643f642-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:96:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449060, 'reachable_time': 18608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228392, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.181 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[14cb891a-f736-433f-b4c5-afa1ceb7f2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.221 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce57264-f7ab-45d4-995b-d345a821c2bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.222 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4643f642-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.222 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.223 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4643f642-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.224 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 kernel: tap4643f642-f0: entered promiscuous mode
Jan 23 09:36:25 compute-0 NetworkManager[54920]: <info>  [1769160985.2258] manager: (tap4643f642-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.230 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4643f642-f0, col_values=(('external_ids', {'iface-id': 'b964e497-de7e-4294-b14d-3cf88cb4cf4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 ovn_controller[94697]: 2026-01-23T09:36:25Z|00586|binding|INFO|Releasing lport b964e497-de7e-4294-b14d-3cf88cb4cf4d from this chassis (sb_readonly=0)
Jan 23 09:36:25 compute-0 nova_compute[182092]: 2026-01-23 09:36:25.244 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.244 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.244 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[912b6695-ff5d-4428-9272-642622905be7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.245 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/4643f642-fdbf-416e-8a97-b503a2990568.pid.haproxy
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 4643f642-fdbf-416e-8a97-b503a2990568
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:36:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:25.245 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'env', 'PROCESS_TAG=haproxy-4643f642-fdbf-416e-8a97-b503a2990568', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4643f642-fdbf-416e-8a97-b503a2990568.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:36:25 compute-0 podman[228420]: 2026-01-23 09:36:25.526287255 +0000 UTC m=+0.033821483 container create f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:36:25 compute-0 systemd[1]: Started libpod-conmon-f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43.scope.
Jan 23 09:36:25 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:36:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70578b1e6f6cd622cabf87c32bfd70886dfa202db7db722b3a0f64978c3cd041/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:36:25 compute-0 podman[228420]: 2026-01-23 09:36:25.577865068 +0000 UTC m=+0.085399317 container init f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:36:25 compute-0 podman[228420]: 2026-01-23 09:36:25.582206907 +0000 UTC m=+0.089741136 container start f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:36:25 compute-0 podman[228420]: 2026-01-23 09:36:25.509497648 +0000 UTC m=+0.017031896 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:36:25 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [NOTICE]   (228436) : New worker (228438) forked
Jan 23 09:36:25 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [NOTICE]   (228436) : Loading success.
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.033 182096 DEBUG nova.virt.libvirt.host [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Removed pending event for 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.033 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160986.03282, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.034 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Resumed (Lifecycle Event)
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.044 182096 DEBUG nova.compute.manager [None req-6f00314f-ef53-4c83-b8a3-3e5617738830 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.097 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.099 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.170 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.170 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769160986.034739, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.170 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Started (Lifecycle Event)
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.189 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.191 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.219 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.219 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.219 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.220 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.220 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.220 182096 WARNING nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state rescued and task_state None.
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.220 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.221 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.221 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.221 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.221 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.222 182096 WARNING nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state rescued and task_state None.
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.222 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.222 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.222 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.223 182096 DEBUG oslo_concurrency.lockutils [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.223 182096 DEBUG nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.223 182096 WARNING nova.compute.manager [req-9ccfe3a7-52b5-4d13-b32b-1cb9e17543a3 req-0291b002-a027-4a61-b15d-9aff303b44e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state rescued and task_state None.
Jan 23 09:36:26 compute-0 nova_compute[182092]: 2026-01-23 09:36:26.541 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:26.851 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:27 compute-0 nova_compute[182092]: 2026-01-23 09:36:27.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:29 compute-0 podman[228450]: 2026-01-23 09:36:29.201858447 +0000 UTC m=+0.040710618 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:36:29 compute-0 podman[228451]: 2026-01-23 09:36:29.236170905 +0000 UTC m=+0.073321476 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:36:31 compute-0 nova_compute[182092]: 2026-01-23 09:36:31.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:32 compute-0 podman[228487]: 2026-01-23 09:36:32.210592418 +0000 UTC m=+0.049021131 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:36:32 compute-0 nova_compute[182092]: 2026-01-23 09:36:32.441 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.002 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000091', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7101771f6abd46148c36f80fed40c40e', 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'hostId': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.005 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 / tap18e2ec6e-de inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.005 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f3dc5ac-5d94-4115-9585-60197cfe226d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.003127', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00e40974-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': '9cfb522652aa06bbc404954a3c902dfc100d60a195096d6b66bbf3fcc53a38b7'}]}, 'timestamp': '2026-01-23 09:36:33.005976', '_unique_id': '054a11eb2bd54619ba1b884fa0fe9a49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.006 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.041 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.041 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.041 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c9b5500-4315-48e0-a20f-bcf84ead1a13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.007713', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00e971ca-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '603c811deb8519f7cc30f372d054e605df2676c034349e2a7c18ae2ec301ecfc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.007713', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00e97b52-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'dd7a295c047ddb60826233c1143d09e52dffe8f4da947afc3ff0c8151b9810f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.007713', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00e98444-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '2fb944394dd89e8d40056baf18b08911d02407e777c57c37841daa5cb8697444'}]}, 'timestamp': '2026-01-23 09:36:33.041820', '_unique_id': '9cbe1b42da3d4afca13fd22ba3ae6af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.042 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>]
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.043 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2bbc379-9de3-40bb-8c43-7712ace6cd91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.043440', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00e9cc2e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '10e35eeccfd0c98fc6ddd40d77660499e51ad2c694033d704c1f8493edf2ae04'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 
'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.043440', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00e9d4ee-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '5948c7bdac570649d5ab095faeb06ee01c61e6a83f3317ba572ef06c4cbe9b10'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.043440', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00e9dc6e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '8b6a3da21067e4292cb1c0ed706b906b35bfd3abf5626283586ea6acec84dccf'}]}, 'timestamp': '2026-01-23 09:36:33.044069', '_unique_id': '325adbb9cf764328858c0e0edc55a1d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.044 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>]
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.045 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '768ab502-80ed-4251-a802-511ca2c62e31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.045491', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00ea1c38-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '4ce773186cc1b9e3622c2350dc663b486d0068b5ce9a113a884bf8799f8b3ac6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': 
None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.045491', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00ea2502-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'c5a7cf8da16a82330d97171474e4d4b6105b0b96f8afedd37038cafa07229a82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.045491', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00ea2c82-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'cae1301db061ba6a15b90d6ca5319243267f98718239e5d66c4b5b81bbe4ef4d'}]}, 'timestamp': '2026-01-23 09:36:33.046118', '_unique_id': '74f3b3e6b1c344d59de7bdb368b3e26e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.047 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '206d7437-7582-4973-a81a-95e8d4aae810', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.047215', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00ea5ffe-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': '0c154420d61fc83317b3c96232d70e379bc3f571d5a2af1914251e18b4eae1a7'}]}, 'timestamp': '2026-01-23 09:36:33.047451', '_unique_id': '39eeaa4a025e42a287bad6490d126c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.050 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.065 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.065 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c307df3f-e6c8-42ad-9178-bbb6bd9d577e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.051372', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00ed2c0c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '1e48b4f258fbbe600cd5092a07c25d0b71fe31399628be830b47ac780266b10b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.051372', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00ed35bc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': 'd01e21f64798611a05ecb75dae418c70a35eae2d3d98c5186c9c603a8f0d65ae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.051372', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00ed3d78-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '2275cf1f9c909000dd02cef60b95d22c28d2c07c10b92675fc39969b31e6657a'}]}, 'timestamp': '2026-01-23 09:36:33.066220', '_unique_id': 'b93797d43cea420daf1289a0f1356474'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.067 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.067 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.067 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1bf6843-4588-4f8b-a455-937fb9b4d76e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.067444', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00ed75b8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '01184cb22a6490917934da0ab086bc7cda49cf393b66c99c4513a90a6ece21df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.067444', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00ed7e8c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '134d207732df659aa1fd9ad620c4a2a9cccd996215f6f2236a0cab7518dc9839'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.067444', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00ed85c6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'bb2cc45fc909cdb91ba0176a498890abc50bf72e1848ec014e716a4334c5ded2'}]}, 'timestamp': '2026-01-23 09:36:33.068062', '_unique_id': '5f6378d5fb9f48658ac3dcd2d3742767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.069 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.069 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.latency volume: 181829983 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.070 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.070 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.read.latency volume: 3221655 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0aecdadf-81a2-431f-b01c-a4802322acef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 181829983, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.069899', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00eddc88-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '2731d3f5d8c490a471eeb673e0b1f719b18406d7c535effa976471e96f2e35c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.069899', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00ededea-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '55fd189461cc85c546faf5b3be7e68ac8574f7b975119ae5be5352e9d337f215'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3221655, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.069899', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00edf63c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'b8ef7d5f9a1646512ea516c7ab9086e4ef98e2fdcd68969392c02c6f7d911e11'}]}, 'timestamp': '2026-01-23 09:36:33.070945', '_unique_id': '25a19589f5d847d2a955ff39ae4f8ff8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.072 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02eb23a1-c4c5-4048-8608-c4368ffb92d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.072097', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00ee34da-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': '904b05ec6b80fdc874d0cb2ad1c0adddd1cf2e18c0a9a42deb6e8f507005922b'}]}, 'timestamp': '2026-01-23 09:36:33.072690', '_unique_id': '8e572c433fb942538d98f9230c657ce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63cb501c-8de2-4ec8-87ad-61601a4ff59f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.075025', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00ee9e66-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': 'b496521cf9b322b8377c8db1a32f1f3d5a0e157993d18a42406189605fde3425'}]}, 'timestamp': '2026-01-23 09:36:33.075265', '_unique_id': 'b07161e1825445eaa5c762c007ad309b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>]
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.077 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84f54ec2-254e-428c-9e97-f2ba3a6c5b00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.077138', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00eef776-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': '193536fd9556e56f2dd92383522d19cb8af4faeb29c1fd1331edf564318172de'}]}, 'timestamp': '2026-01-23 09:36:33.077704', '_unique_id': 'cb3cd99960be4c0abdda42e3c71126fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.078 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.079 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad5ca2db-93b0-4059-adf8-e28fc544aa23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.079717', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00ef555e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': 'ee16f3b882a8ce6bc858ff82cabc2fa2b699ef6e71a890caa97341381f4e0217'}]}, 'timestamp': '2026-01-23 09:36:33.079949', '_unique_id': '0035dd5a952e4d248fdbcb2cb92d1bf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.080 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.081 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2da37382-e9af-46ce-a0d1-9dae354ebc4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.081437', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00efa108-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': 'aef7ce29accba8c5a2dc1b116c9c52cabd18f1c43c6d041143b3a71434de32e1'}]}, 'timestamp': '2026-01-23 09:36:33.082021', '_unique_id': '479ac0ae757e42fcae3995067234a22a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2de600cd-e61f-4083-86cb-cae7c77061aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.083983', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00effbda-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': '300ff3174d0af311956b9281c5bd92c573030df32e1a024eb5b80f9d825d826b'}]}, 'timestamp': '2026-01-23 09:36:33.084211', '_unique_id': 'e903939afbec4995a2034e7f026987b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.084 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.085 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.086 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.086 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9170dbd-4706-40c1-a61e-b1a1e87df550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.085604', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00f04310-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '0abb6a5003d322be73fba461c4b83ba2a5e3f80a114b0c50953f32be4091e04a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.085604', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00f05652-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': 'af6aefd561aafd083248c57a1baff8eeaa84be480fb697e324cc53f78728b677'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.085604', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00f069d0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '8690b81671f3f76c39dd3e96b81642657404a2d06ca9944faf58c40afd30b4e7'}]}, 'timestamp': '2026-01-23 09:36:33.087127', '_unique_id': '4d149681f5694fec8090d084b042e228'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.088 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.089 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ed88ef-2f18-49e0-a99b-b5325f3ffe6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.088752', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00f0b610-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '2485268630b5881ea5d282d9749f4f9fa882ed952478829d5a32177740253801'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.088752', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00f0bdfe-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': '43e3c241a3f6688af175ed4f2d8a7a6914ad7c2b60594a443afab8939792aa87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.088752', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00f0c574-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53709924, 'message_signature': 'b1fa82f5a73e8e81732e206b07ae98e11af0b5a802418a37cae9d7cfdf960ab9'}]}, 'timestamp': '2026-01-23 09:36:33.089411', '_unique_id': 'c8d3a056a6cf4e5a97ab657390d50b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.091 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.102 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.102 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0: ceilometer.compute.pollsters.NoVolumeException
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.103 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a6bb3b8-03b1-45db-b41a-26f715894c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.103296', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00f2f59c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': 'b98ebc1a8ac80b18a96ee1e3c7769df4441300911cf9e794467e997c9d79d3a7'}]}, 'timestamp': '2026-01-23 09:36:33.103894', '_unique_id': '25f81f7116c14f39afb125fced15a48e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.104 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-185163690>]
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.105 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c53c1af6-6ca1-44a1-8ca4-9c11982d0550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 'instance-00000091-614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-tap18e2ec6e-de', 'timestamp': '2026-01-23T09:36:33.105456', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'tap18e2ec6e-de', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:6a:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap18e2ec6e-de'}, 'message_id': '00f3429a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.53251319, 'message_signature': 'b66746ba3d959b0791ea233aba7b9155106965b40847443afb24e949fa606b90'}]}, 'timestamp': '2026-01-23 09:36:33.105826', '_unique_id': '87eea2b3528044b5aa3e9e657ba1e58b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.108 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.108 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/cpu volume: 6930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f4edae-8faa-447e-8861-1c813e67261f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6930000000, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'timestamp': '2026-01-23T09:36:33.108235', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '00f3b72a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.631900699, 'message_signature': 'e2c31c26fd03f44df41093c4f92ebe3c6c82a0ace7a84883dcfa2ab1828ce1fc'}]}, 'timestamp': '2026-01-23 09:36:33.108820', '_unique_id': '1d451fe525fb43cabf9910e60317a929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.109 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.110 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.110 12 DEBUG ceilometer.compute.pollsters [-] 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a137207-a9eb-42f5-b78f-98f9a2939412', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vda', 'timestamp': '2026-01-23T09:36:33.109916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00f3f0b4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '72bba78b556dd2dd2640031759f32fbdaa4f9e8453425be49023ddedf415cc17'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': 
'614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-vdb', 'timestamp': '2026-01-23T09:36:33.109916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '00f3f8b6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': '9320d3e288ab58965342fc4c1f4f77b0659451e7c16c6a8ce4ce290e3efae20b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '35902c3964864ac9b7e446cb9f746d80', 'user_name': None, 'project_id': '7101771f6abd46148c36f80fed40c40e', 'project_name': None, 'resource_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-sda', 'timestamp': '2026-01-23T09:36:33.109916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-185163690', 'name': 'instance-00000091', 'instance_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'instance_type': 'm1.nano', 'host': 'ffe7c9ec470c2ec0fd34a06b641b7651c492e555e9f7dfb22091df11', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00f40aae-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4498.58078698, 'message_signature': 'd8c7deb783d5a2ef67bac703e06c0918809d2061bfe95e285480c563cdfb2922'}]}, 'timestamp': '2026-01-23 09:36:33.110931', '_unique_id': 'deff6bb186434b9183b8c3d57756a0d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:36:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:36:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:36:33 compute-0 nova_compute[182092]: 2026-01-23 09:36:33.994 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:33 compute-0 nova_compute[182092]: 2026-01-23 09:36:33.995 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:33 compute-0 nova_compute[182092]: 2026-01-23 09:36:33.995 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:33 compute-0 nova_compute[182092]: 2026-01-23 09:36:33.995 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:33 compute-0 nova_compute[182092]: 2026-01-23 09:36:33.995 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.003 182096 INFO nova.compute.manager [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Terminating instance
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.009 182096 DEBUG nova.compute.manager [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:36:34 compute-0 kernel: tap18e2ec6e-de (unregistering): left promiscuous mode
Jan 23 09:36:34 compute-0 NetworkManager[54920]: <info>  [1769160994.0279] device (tap18e2ec6e-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:36:34 compute-0 ovn_controller[94697]: 2026-01-23T09:36:34Z|00587|binding|INFO|Releasing lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc from this chassis (sb_readonly=0)
Jan 23 09:36:34 compute-0 ovn_controller[94697]: 2026-01-23T09:36:34Z|00588|binding|INFO|Setting lport 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc down in Southbound
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.034 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 ovn_controller[94697]: 2026-01-23T09:36:34Z|00589|binding|INFO|Removing iface tap18e2ec6e-de ovn-installed in OVS
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.036 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.040 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:6a:08 10.100.0.14'], port_security=['fa:16:3e:80:6a:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '614c42dd-39f8-42e7-b9c5-b0d69d8e17c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4643f642-fdbf-416e-8a97-b503a2990568', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7101771f6abd46148c36f80fed40c40e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '92222b96-d0dc-43a0-9343-875648372333', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f71413d6-fb1e-4318-a143-8bcb0cb14270, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.041 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc in datapath 4643f642-fdbf-416e-8a97-b503a2990568 unbound from our chassis
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.042 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4643f642-fdbf-416e-8a97-b503a2990568, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.043 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b99b77de-43fe-4d02-99c2-c7cdecf00e64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.043 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 namespace which is not needed anymore
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.050 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 23 09:36:34 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000091.scope: Consumed 9.035s CPU time.
Jan 23 09:36:34 compute-0 systemd-machined[153562]: Machine qemu-75-instance-00000091 terminated.
Jan 23 09:36:34 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [NOTICE]   (228436) : haproxy version is 2.8.14-c23fe91
Jan 23 09:36:34 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [NOTICE]   (228436) : path to executable is /usr/sbin/haproxy
Jan 23 09:36:34 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [WARNING]  (228436) : Exiting Master process...
Jan 23 09:36:34 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [ALERT]    (228436) : Current worker (228438) exited with code 143 (Terminated)
Jan 23 09:36:34 compute-0 neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568[228432]: [WARNING]  (228436) : All workers exited. Exiting... (0)
Jan 23 09:36:34 compute-0 systemd[1]: libpod-f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43.scope: Deactivated successfully.
Jan 23 09:36:34 compute-0 conmon[228432]: conmon f29cf717bdd40c583a84 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43.scope/container/memory.events
Jan 23 09:36:34 compute-0 podman[228526]: 2026-01-23 09:36:34.137955279 +0000 UTC m=+0.035388249 container died f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:36:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43-userdata-shm.mount: Deactivated successfully.
Jan 23 09:36:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-70578b1e6f6cd622cabf87c32bfd70886dfa202db7db722b3a0f64978c3cd041-merged.mount: Deactivated successfully.
Jan 23 09:36:34 compute-0 podman[228526]: 2026-01-23 09:36:34.153700417 +0000 UTC m=+0.051133366 container cleanup f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:36:34 compute-0 systemd[1]: libpod-conmon-f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43.scope: Deactivated successfully.
Jan 23 09:36:34 compute-0 podman[228548]: 2026-01-23 09:36:34.19004621 +0000 UTC m=+0.022088854 container remove f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.193 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[943b3f21-642f-478a-8252-3e4cb925b8c0]: (4, ('Fri Jan 23 09:36:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 (f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43)\nf29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43\nFri Jan 23 09:36:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 (f29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43)\nf29cf717bdd40c583a841757c9931efb25386ac794bbc9112d5ec60296e09e43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.194 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b95bf252-3a11-4f43-8c6c-1761c6608bb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.195 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4643f642-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.196 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 kernel: tap4643f642-f0: left promiscuous mode
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.212 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.214 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.214 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[736c0061-f9be-4834-a5c2-388d93a297b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.224 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[234bc462-2c88-4300-8cc3-83a5cc187cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.225 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bc128feb-7624-44e7-955c-72420112dca6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.237 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[101587da-5e13-48c3-8e3d-0965fb53a55a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449055, 'reachable_time': 25481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228570, 'error': None, 'target': 'ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d4643f642\x2dfdbf\x2d416e\x2d8a97\x2db503a2990568.mount: Deactivated successfully.
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.239 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4643f642-fdbf-416e-8a97-b503a2990568 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:36:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:34.239 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[4891bce7-b903-4aa4-8360-e1f80bf22454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.268 182096 INFO nova.virt.libvirt.driver [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Instance destroyed successfully.
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.269 182096 DEBUG nova.objects.instance [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lazy-loading 'resources' on Instance uuid 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.287 182096 DEBUG nova.virt.libvirt.vif [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:35:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-185163690',display_name='tempest-ServerRescueNegativeTestJSON-server-185163690',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-185163690',id=145,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:36:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7101771f6abd46148c36f80fed40c40e',ramdisk_id='',reservation_id='r-4xmktry0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-776163304',owner_user_name='tempest-ServerRescueNegativeTestJSON-776163304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:36:26Z,user_data=None,user_id='35902c3964864ac9b7e446cb9f746d80',uuid=614c42dd-39f8-42e7-b9c5-b0d69d8e17c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.288 182096 DEBUG nova.network.os_vif_util [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converting VIF {"id": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "address": "fa:16:3e:80:6a:08", "network": {"id": "4643f642-fdbf-416e-8a97-b503a2990568", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1622158593-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7101771f6abd46148c36f80fed40c40e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18e2ec6e-de", "ovs_interfaceid": "18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.288 182096 DEBUG nova.network.os_vif_util [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.289 182096 DEBUG os_vif [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.290 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.290 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18e2ec6e-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.291 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.292 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.294 182096 INFO os_vif [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:6a:08,bridge_name='br-int',has_traffic_filtering=True,id=18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc,network=Network(4643f642-fdbf-416e-8a97-b503a2990568),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18e2ec6e-de')
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.295 182096 INFO nova.virt.libvirt.driver [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Deleting instance files /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0_del
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.295 182096 INFO nova.virt.libvirt.driver [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Deletion of /var/lib/nova/instances/614c42dd-39f8-42e7-b9c5-b0d69d8e17c0_del complete
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.300 182096 DEBUG nova.compute.manager [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.300 182096 DEBUG oslo_concurrency.lockutils [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.301 182096 DEBUG oslo_concurrency.lockutils [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.301 182096 DEBUG oslo_concurrency.lockutils [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.301 182096 DEBUG nova.compute.manager [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.301 182096 DEBUG nova.compute.manager [req-82b62c4c-b75f-4beb-8c52-5bbfb8c2e28c req-854af373-7d38-4f3d-9e89-98e39d19ef40 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-unplugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.375 182096 INFO nova.compute.manager [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.376 182096 DEBUG oslo.service.loopingcall [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.376 182096 DEBUG nova.compute.manager [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:36:34 compute-0 nova_compute[182092]: 2026-01-23 09:36:34.376 182096 DEBUG nova.network.neutron [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.134 182096 DEBUG nova.network.neutron [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.163 182096 INFO nova.compute.manager [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Took 0.79 seconds to deallocate network for instance.
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.248 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.249 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.314 182096 DEBUG nova.compute.provider_tree [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.341 182096 DEBUG nova.scheduler.client.report [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.369 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.402 182096 INFO nova.scheduler.client.report [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Deleted allocations for instance 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0
Jan 23 09:36:35 compute-0 nova_compute[182092]: 2026-01-23 09:36:35.482 182096 DEBUG oslo_concurrency.lockutils [None req-850c96fa-a97a-4c57-96d7-b0fb31ab8d47 35902c3964864ac9b7e446cb9f746d80 7101771f6abd46148c36f80fed40c40e - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.487s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.447 182096 DEBUG nova.compute.manager [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.448 182096 DEBUG oslo_concurrency.lockutils [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.448 182096 DEBUG oslo_concurrency.lockutils [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.448 182096 DEBUG oslo_concurrency.lockutils [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "614c42dd-39f8-42e7-b9c5-b0d69d8e17c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.448 182096 DEBUG nova.compute.manager [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] No waiting events found dispatching network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.448 182096 WARNING nova.compute.manager [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received unexpected event network-vif-plugged-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc for instance with vm_state deleted and task_state None.
Jan 23 09:36:36 compute-0 nova_compute[182092]: 2026-01-23 09:36:36.449 182096 DEBUG nova.compute.manager [req-e84b0108-7d8e-424a-8400-08c04a930aeb req-fefe7196-9fdf-4ac3-a4b9-21ff9fd356af 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Received event network-vif-deleted-18e2ec6e-deba-4e49-8a5b-1bcbef0e5dbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:37 compute-0 nova_compute[182092]: 2026-01-23 09:36:37.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:39 compute-0 nova_compute[182092]: 2026-01-23 09:36:39.292 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:39 compute-0 nova_compute[182092]: 2026-01-23 09:36:39.743 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:39.868 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:39.868 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:36:39.869 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:41 compute-0 podman[228586]: 2026-01-23 09:36:41.219961384 +0000 UTC m=+0.057215588 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller)
Jan 23 09:36:42 compute-0 nova_compute[182092]: 2026-01-23 09:36:42.444 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:44 compute-0 nova_compute[182092]: 2026-01-23 09:36:44.294 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:47 compute-0 nova_compute[182092]: 2026-01-23 09:36:47.445 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:49 compute-0 podman[228610]: 2026-01-23 09:36:49.228204754 +0000 UTC m=+0.064108158 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:36:49 compute-0 podman[228609]: 2026-01-23 09:36:49.235231387 +0000 UTC m=+0.073129042 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:36:49 compute-0 nova_compute[182092]: 2026-01-23 09:36:49.268 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160994.2668304, 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:36:49 compute-0 nova_compute[182092]: 2026-01-23 09:36:49.268 182096 INFO nova.compute.manager [-] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] VM Stopped (Lifecycle Event)
Jan 23 09:36:49 compute-0 nova_compute[182092]: 2026-01-23 09:36:49.290 182096 DEBUG nova.compute.manager [None req-42f972e9-0ff6-4b3d-b71a-d914b3cf0f13 - - - - - -] [instance: 614c42dd-39f8-42e7-b9c5-b0d69d8e17c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:36:49 compute-0 nova_compute[182092]: 2026-01-23 09:36:49.295 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:52 compute-0 nova_compute[182092]: 2026-01-23 09:36:52.447 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:54 compute-0 nova_compute[182092]: 2026-01-23 09:36:54.296 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.193 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.194 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.212 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.301 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.301 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.310 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.311 182096 INFO nova.compute.claims [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.440 182096 DEBUG nova.compute.provider_tree [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.449 182096 DEBUG nova.scheduler.client.report [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.469 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.469 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.525 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.525 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.538 182096 INFO nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.552 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.640 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.641 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.641 182096 INFO nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Creating image(s)
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.642 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.642 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.643 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.653 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.706 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.707 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.708 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.717 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.769 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.770 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.795 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.796 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.796 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.842 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.843 182096 DEBUG nova.virt.disk.api [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.843 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.894 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.894 182096 DEBUG nova.virt.disk.api [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.895 182096 DEBUG nova.objects.instance [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.922 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.922 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Ensure instance console log exists: /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.923 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.923 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:36:55 compute-0 nova_compute[182092]: 2026-01-23 09:36:55.923 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:36:56 compute-0 nova_compute[182092]: 2026-01-23 09:36:56.510 182096 DEBUG nova.policy [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:36:57 compute-0 nova_compute[182092]: 2026-01-23 09:36:57.450 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.052 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Successfully created port: 647c3660-1915-4e07-8634-43e7c7e49321 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.616 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Successfully updated port: 647c3660-1915-4e07-8634-43e7c7e49321 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.630 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.630 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.631 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.717 182096 DEBUG nova.compute.manager [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-changed-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.717 182096 DEBUG nova.compute.manager [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing instance network info cache due to event network-changed-647c3660-1915-4e07-8634-43e7c7e49321. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.718 182096 DEBUG oslo_concurrency.lockutils [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:36:58 compute-0 nova_compute[182092]: 2026-01-23 09:36:58.787 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:36:59 compute-0 nova_compute[182092]: 2026-01-23 09:36:59.298 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:00 compute-0 podman[228663]: 2026-01-23 09:37:00.20931731 +0000 UTC m=+0.041985512 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 09:37:00 compute-0 podman[228664]: 2026-01-23 09:37:00.212311828 +0000 UTC m=+0.042631741 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.692 182096 DEBUG nova.network.neutron [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.715 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.715 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance network_info: |[{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.715 182096 DEBUG oslo_concurrency.lockutils [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.716 182096 DEBUG nova.network.neutron [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.718 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Start _get_guest_xml network_info=[{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.721 182096 WARNING nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.730 182096 DEBUG nova.virt.libvirt.host [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.731 182096 DEBUG nova.virt.libvirt.host [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.735 182096 DEBUG nova.virt.libvirt.host [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.735 182096 DEBUG nova.virt.libvirt.host [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.736 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.736 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.737 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.737 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.737 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.737 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.737 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.738 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.738 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.738 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.738 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.738 182096 DEBUG nova.virt.hardware [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.741 182096 DEBUG nova.virt.libvirt.vif [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-449213939',display_name='tempest-TestNetworkBasicOps-server-449213939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-449213939',id=147,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDLlh/W8fpdYSmbxeIab5T+zjxt4urFKyLKqSjPhnxgn6AKuUkpHIVSILw1O0d54oiROsTsbOfe+NEugfjZDIrHCSlRU3BF04WQfou4hQRgLlD/lKNDa5dCseNGA1Hqsqw==',key_name='tempest-TestNetworkBasicOps-974513515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-xkgdgeo1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:55Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=bd78986f-3b06-4345-8540-cb6ab9009fb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.741 182096 DEBUG nova.network.os_vif_util [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.742 182096 DEBUG nova.network.os_vif_util [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.743 182096 DEBUG nova.objects.instance [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.755 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <uuid>bd78986f-3b06-4345-8540-cb6ab9009fb2</uuid>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <name>instance-00000093</name>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-449213939</nova:name>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:37:00</nova:creationTime>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         <nova:port uuid="647c3660-1915-4e07-8634-43e7c7e49321">
Jan 23 09:37:00 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <system>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="serial">bd78986f-3b06-4345-8540-cb6ab9009fb2</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="uuid">bd78986f-3b06-4345-8540-cb6ab9009fb2</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </system>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <os>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </os>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <features>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </features>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.config"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:b4:49:8c"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <target dev="tap647c3660-19"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/console.log" append="off"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <video>
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </video>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:37:00 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:37:00 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:37:00 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:37:00 compute-0 nova_compute[182092]: </domain>
Jan 23 09:37:00 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.756 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Preparing to wait for external event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.756 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.756 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.757 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.757 182096 DEBUG nova.virt.libvirt.vif [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-449213939',display_name='tempest-TestNetworkBasicOps-server-449213939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-449213939',id=147,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDLlh/W8fpdYSmbxeIab5T+zjxt4urFKyLKqSjPhnxgn6AKuUkpHIVSILw1O0d54oiROsTsbOfe+NEugfjZDIrHCSlRU3BF04WQfou4hQRgLlD/lKNDa5dCseNGA1Hqsqw==',key_name='tempest-TestNetworkBasicOps-974513515',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-xkgdgeo1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:55Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=bd78986f-3b06-4345-8540-cb6ab9009fb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.757 182096 DEBUG nova.network.os_vif_util [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.758 182096 DEBUG nova.network.os_vif_util [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.758 182096 DEBUG os_vif [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.758 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.759 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.759 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.761 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap647c3660-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.762 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap647c3660-19, col_values=(('external_ids', {'iface-id': '647c3660-1915-4e07-8634-43e7c7e49321', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:49:8c', 'vm-uuid': 'bd78986f-3b06-4345-8540-cb6ab9009fb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.763 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:00 compute-0 NetworkManager[54920]: <info>  [1769161020.7637] manager: (tap647c3660-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.766 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.768 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.768 182096 INFO os_vif [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19')
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.806 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.806 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.807 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:b4:49:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:37:00 compute-0 nova_compute[182092]: 2026-01-23 09:37:00.807 182096 INFO nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Using config drive
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.214 182096 INFO nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Creating config drive at /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.config
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.218 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvj277mq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.338 182096 DEBUG oslo_concurrency.processutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvj277mq" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:37:01 compute-0 kernel: tap647c3660-19: entered promiscuous mode
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.3747] manager: (tap647c3660-19): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 23 09:37:01 compute-0 ovn_controller[94697]: 2026-01-23T09:37:01Z|00590|binding|INFO|Claiming lport 647c3660-1915-4e07-8634-43e7c7e49321 for this chassis.
Jan 23 09:37:01 compute-0 ovn_controller[94697]: 2026-01-23T09:37:01Z|00591|binding|INFO|647c3660-1915-4e07-8634-43e7c7e49321: Claiming fa:16:3e:b4:49:8c 10.100.0.10
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.379 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.386 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:49:8c 10.100.0.10'], port_security=['fa:16:3e:b4:49:8c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e313631-de8d-446e-b2e7-770dbeeb9852', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16f4f7bd-41ae-4b52-8026-76105c06c3aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e71e79f1-2acd-43dc-97c0-c92006b6370d, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=647c3660-1915-4e07-8634-43e7c7e49321) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.387 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 647c3660-1915-4e07-8634-43e7c7e49321 in datapath 2e313631-de8d-446e-b2e7-770dbeeb9852 bound to our chassis
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.388 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2e313631-de8d-446e-b2e7-770dbeeb9852
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.396 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fa08e0-ec09-4e40-b9f5-b6c32a6495ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.397 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2e313631-d1 in ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.398 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2e313631-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.398 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b53f450-bc93-4b72-bc4b-0e6d804890a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.399 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ee9308-88da-4a84-9a38-1dfcf9b3f7a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 systemd-udevd[228719]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.410 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9de557-bb43-41a0-b83b-398473c416de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.4171] device (tap647c3660-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.4178] device (tap647c3660-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:37:01 compute-0 systemd-machined[153562]: New machine qemu-76-instance-00000093.
Jan 23 09:37:01 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-00000093.
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.443 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.445 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[855e221d-594e-41dc-bd8a-2108e67e2264]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_controller[94697]: 2026-01-23T09:37:01Z|00592|binding|INFO|Setting lport 647c3660-1915-4e07-8634-43e7c7e49321 ovn-installed in OVS
Jan 23 09:37:01 compute-0 ovn_controller[94697]: 2026-01-23T09:37:01Z|00593|binding|INFO|Setting lport 647c3660-1915-4e07-8634-43e7c7e49321 up in Southbound
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.453 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.467 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9727a6f0-0c3c-47b7-988f-8ab423e75c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.469 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0e807b94-fbf0-4406-8f0e-1fd0f6bc0e4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.4720] manager: (tap2e313631-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Jan 23 09:37:01 compute-0 systemd-udevd[228724]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.493 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f592ad28-e03a-4a8b-b3ad-bfb139cbfe71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.495 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[48e0581b-78f9-4642-8230-c93164095ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.5116] device (tap2e313631-d0): carrier: link connected
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.515 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[fae0d9b4-9ce8-4b13-98dc-681e6f3821e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.528 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[caeb46fb-cff8-4f04-8911-f5540fd5d2ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e313631-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:31:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452699, 'reachable_time': 33996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228744, 'error': None, 'target': 'ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.539 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6accb2-9188-498c-b33c-2f1e29d628c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:3153'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452699, 'tstamp': 452699}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228745, 'error': None, 'target': 'ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.553 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4135b861-761e-4813-98ea-bffcefe43de1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e313631-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:31:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452699, 'reachable_time': 33996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228746, 'error': None, 'target': 'ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.574 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[57a2ba78-d63f-4a26-b876-298d862e9089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.625 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[55262db3-b5d4-4670-b50e-3c5c529d6bcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.626 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e313631-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.626 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.627 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e313631-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.628 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 NetworkManager[54920]: <info>  [1769161021.6288] manager: (tap2e313631-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 23 09:37:01 compute-0 kernel: tap2e313631-d0: entered promiscuous mode
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.632 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2e313631-d0, col_values=(('external_ids', {'iface-id': 'd65b2ee0-0456-4245-9d04-610259fcbfde'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.632 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 ovn_controller[94697]: 2026-01-23T09:37:01Z|00594|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.635 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2e313631-de8d-446e-b2e7-770dbeeb9852.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2e313631-de8d-446e-b2e7-770dbeeb9852.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.635 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[34f31c14-f1e6-4270-90f2-aa8765391b8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.636 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-2e313631-de8d-446e-b2e7-770dbeeb9852
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/2e313631-de8d-446e-b2e7-770dbeeb9852.pid.haproxy
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 2e313631-de8d-446e-b2e7-770dbeeb9852
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:37:01 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:01.636 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852', 'env', 'PROCESS_TAG=haproxy-2e313631-de8d-446e-b2e7-770dbeeb9852', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2e313631-de8d-446e-b2e7-770dbeeb9852.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.644 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.713 182096 DEBUG nova.compute.manager [req-781cf797-6ae6-40d0-9e7d-22889e874e64 req-dbdf9fad-359d-4d08-95a2-6cab192f4c0d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.714 182096 DEBUG oslo_concurrency.lockutils [req-781cf797-6ae6-40d0-9e7d-22889e874e64 req-dbdf9fad-359d-4d08-95a2-6cab192f4c0d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.714 182096 DEBUG oslo_concurrency.lockutils [req-781cf797-6ae6-40d0-9e7d-22889e874e64 req-dbdf9fad-359d-4d08-95a2-6cab192f4c0d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.715 182096 DEBUG oslo_concurrency.lockutils [req-781cf797-6ae6-40d0-9e7d-22889e874e64 req-dbdf9fad-359d-4d08-95a2-6cab192f4c0d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.715 182096 DEBUG nova.compute.manager [req-781cf797-6ae6-40d0-9e7d-22889e874e64 req-dbdf9fad-359d-4d08-95a2-6cab192f4c0d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Processing event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.826 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161021.8258698, bd78986f-3b06-4345-8540-cb6ab9009fb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.831 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] VM Started (Lifecycle Event)
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.832 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.835 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.840 182096 INFO nova.virt.libvirt.driver [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance spawned successfully.
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.840 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.859 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.861 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.868 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.869 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.869 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.870 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.870 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.871 182096 DEBUG nova.virt.libvirt.driver [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.895 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.896 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161021.8259602, bd78986f-3b06-4345-8540-cb6ab9009fb2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.896 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] VM Paused (Lifecycle Event)
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.918 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.920 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161021.8347402, bd78986f-3b06-4345-8540-cb6ab9009fb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.920 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] VM Resumed (Lifecycle Event)
Jan 23 09:37:01 compute-0 podman[228781]: 2026-01-23 09:37:01.919414468 +0000 UTC m=+0.032740534 container create e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.937 182096 INFO nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Took 6.30 seconds to spawn the instance on the hypervisor.
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.938 182096 DEBUG nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.939 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.944 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:37:01 compute-0 systemd[1]: Started libpod-conmon-e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59.scope.
Jan 23 09:37:01 compute-0 nova_compute[182092]: 2026-01-23 09:37:01.964 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:37:01 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a85481472e38637ca6b038b4a7d084995f5b9b51266f95475844976049913bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:37:01 compute-0 podman[228781]: 2026-01-23 09:37:01.97853584 +0000 UTC m=+0.091861906 container init e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:37:01 compute-0 podman[228781]: 2026-01-23 09:37:01.984000827 +0000 UTC m=+0.097326883 container start e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:37:01 compute-0 podman[228781]: 2026-01-23 09:37:01.90579551 +0000 UTC m=+0.019121576 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:37:02 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [NOTICE]   (228797) : New worker (228799) forked
Jan 23 09:37:02 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [NOTICE]   (228797) : Loading success.
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.002 182096 INFO nova.compute.manager [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Took 6.73 seconds to build instance.
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.016 182096 DEBUG oslo_concurrency.lockutils [None req-01f34b78-bd9c-4c43-a495-8e0e708000ee 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.312 182096 DEBUG nova.network.neutron [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated VIF entry in instance network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.312 182096 DEBUG nova.network.neutron [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.328 182096 DEBUG oslo_concurrency.lockutils [req-efe45f7f-3cc6-4c20-810d-d0734d972dde req-7fcd157a-2e7e-4a91-b673-7f146f944450 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:37:02 compute-0 nova_compute[182092]: 2026-01-23 09:37:02.451 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:03 compute-0 podman[228804]: 2026-01-23 09:37:03.22386668 +0000 UTC m=+0.050301636 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public)
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.820 182096 DEBUG nova.compute.manager [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.821 182096 DEBUG oslo_concurrency.lockutils [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.821 182096 DEBUG oslo_concurrency.lockutils [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.821 182096 DEBUG oslo_concurrency.lockutils [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.822 182096 DEBUG nova.compute.manager [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] No waiting events found dispatching network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:37:03 compute-0 nova_compute[182092]: 2026-01-23 09:37:03.822 182096 WARNING nova.compute.manager [req-4f27b35f-2d8c-4eef-9d4b-e9199802ff28 req-a9897c75-8643-4861-8d27-779d05f7a87c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received unexpected event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 for instance with vm_state active and task_state None.
Jan 23 09:37:04 compute-0 NetworkManager[54920]: <info>  [1769161024.3568] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 23 09:37:04 compute-0 NetworkManager[54920]: <info>  [1769161024.3572] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.357 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.491 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:04 compute-0 ovn_controller[94697]: 2026-01-23T09:37:04Z|00595|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.502 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.774 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.805 182096 DEBUG nova.compute.manager [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-changed-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.805 182096 DEBUG nova.compute.manager [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing instance network info cache due to event network-changed-647c3660-1915-4e07-8634-43e7c7e49321. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.805 182096 DEBUG oslo_concurrency.lockutils [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.805 182096 DEBUG oslo_concurrency.lockutils [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:37:04 compute-0 nova_compute[182092]: 2026-01-23 09:37:04.805 182096 DEBUG nova.network.neutron [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:37:05 compute-0 nova_compute[182092]: 2026-01-23 09:37:05.764 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:06 compute-0 nova_compute[182092]: 2026-01-23 09:37:06.524 182096 DEBUG nova.network.neutron [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated VIF entry in instance network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:37:06 compute-0 nova_compute[182092]: 2026-01-23 09:37:06.525 182096 DEBUG nova.network.neutron [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:37:06 compute-0 nova_compute[182092]: 2026-01-23 09:37:06.543 182096 DEBUG oslo_concurrency.lockutils [req-2a10e944-5059-4cbc-ace8-c1c8896f25fb req-1493586d-3104-4f98-b1bd-faa678d085cc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:37:07 compute-0 nova_compute[182092]: 2026-01-23 09:37:07.452 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:09 compute-0 nova_compute[182092]: 2026-01-23 09:37:09.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:10 compute-0 nova_compute[182092]: 2026-01-23 09:37:10.766 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.897 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.898 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.898 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:37:11 compute-0 nova_compute[182092]: 2026-01-23 09:37:11.898 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:37:12 compute-0 podman[228839]: 2026-01-23 09:37:12.221185448 +0000 UTC m=+0.060605732 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 09:37:12 compute-0 nova_compute[182092]: 2026-01-23 09:37:12.453 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:13 compute-0 ovn_controller[94697]: 2026-01-23T09:37:13Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:49:8c 10.100.0.10
Jan 23 09:37:13 compute-0 ovn_controller[94697]: 2026-01-23T09:37:13Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:49:8c 10.100.0.10
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.045 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.072 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.072 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.072 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.073 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.073 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.093 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.094 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.094 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.094 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.168 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.214 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.215 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.259 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.444 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.445 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5522MB free_disk=73.18424224853516GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.445 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.445 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.516 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance bd78986f-3b06-4345-8540-cb6ab9009fb2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.517 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.517 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.566 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.579 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.593 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:37:14 compute-0 nova_compute[182092]: 2026-01-23 09:37:14.594 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:15 compute-0 nova_compute[182092]: 2026-01-23 09:37:15.589 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:15 compute-0 nova_compute[182092]: 2026-01-23 09:37:15.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:15 compute-0 nova_compute[182092]: 2026-01-23 09:37:15.767 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:17 compute-0 nova_compute[182092]: 2026-01-23 09:37:17.455 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:17 compute-0 nova_compute[182092]: 2026-01-23 09:37:17.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:17 compute-0 nova_compute[182092]: 2026-01-23 09:37:17.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:37:17 compute-0 nova_compute[182092]: 2026-01-23 09:37:17.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:37:19 compute-0 nova_compute[182092]: 2026-01-23 09:37:19.365 182096 INFO nova.compute.manager [None req-ab16c967-dc18-4727-bfe6-a921257d0137 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Get console output
Jan 23 09:37:19 compute-0 nova_compute[182092]: 2026-01-23 09:37:19.369 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:37:20 compute-0 podman[228870]: 2026-01-23 09:37:20.208420212 +0000 UTC m=+0.035572957 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:37:20 compute-0 podman[228869]: 2026-01-23 09:37:20.22022092 +0000 UTC m=+0.049484125 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 23 09:37:20 compute-0 nova_compute[182092]: 2026-01-23 09:37:20.770 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:22 compute-0 nova_compute[182092]: 2026-01-23 09:37:22.456 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:24 compute-0 ovn_controller[94697]: 2026-01-23T09:37:24Z|00596|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:37:24 compute-0 nova_compute[182092]: 2026-01-23 09:37:24.110 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:24.910 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:37:24 compute-0 nova_compute[182092]: 2026-01-23 09:37:24.911 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:24.911 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:37:25 compute-0 nova_compute[182092]: 2026-01-23 09:37:25.772 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:26.913 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:37:27 compute-0 nova_compute[182092]: 2026-01-23 09:37:27.457 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:28 compute-0 ovn_controller[94697]: 2026-01-23T09:37:28Z|00597|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:37:28 compute-0 nova_compute[182092]: 2026-01-23 09:37:28.388 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:30 compute-0 nova_compute[182092]: 2026-01-23 09:37:30.774 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:31 compute-0 podman[228908]: 2026-01-23 09:37:31.207198128 +0000 UTC m=+0.042223711 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 09:37:31 compute-0 podman[228909]: 2026-01-23 09:37:31.212225661 +0000 UTC m=+0.044327168 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:37:31 compute-0 ovn_controller[94697]: 2026-01-23T09:37:31Z|00598|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:37:31 compute-0 nova_compute[182092]: 2026-01-23 09:37:31.267 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:32 compute-0 nova_compute[182092]: 2026-01-23 09:37:32.458 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:34 compute-0 podman[228946]: 2026-01-23 09:37:34.202272396 +0000 UTC m=+0.041528500 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Jan 23 09:37:35 compute-0 nova_compute[182092]: 2026-01-23 09:37:35.776 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:35 compute-0 nova_compute[182092]: 2026-01-23 09:37:35.967 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:37 compute-0 nova_compute[182092]: 2026-01-23 09:37:37.460 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:39.869 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:37:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:39.869 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:37:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:39.870 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:37:40 compute-0 nova_compute[182092]: 2026-01-23 09:37:40.778 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:40 compute-0 nova_compute[182092]: 2026-01-23 09:37:40.809 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:41 compute-0 nova_compute[182092]: 2026-01-23 09:37:41.015 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:42 compute-0 nova_compute[182092]: 2026-01-23 09:37:42.462 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:42 compute-0 nova_compute[182092]: 2026-01-23 09:37:42.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:43 compute-0 podman[228964]: 2026-01-23 09:37:43.222446912 +0000 UTC m=+0.061994452 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:37:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:43.834 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:28:3d 10.100.0.2 2001:db8::f816:3eff:fe12:283d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe12:283d/64', 'neutron:device_id': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87bb1d0c-5b11-4fd3-82ca-b36019be3a6b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c684d440-d284-41ed-a1e0-1be5ac0761c9) old=Port_Binding(mac=['fa:16:3e:12:28:3d 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:37:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:43.835 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c684d440-d284-41ed-a1e0-1be5ac0761c9 in datapath 8e5585ef-d2b9-42d4-8407-0e03aa280771 updated
Jan 23 09:37:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:43.836 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e5585ef-d2b9-42d4-8407-0e03aa280771, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:37:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:37:43.837 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9016c86b-84db-4667-8fe5-e1b4c67123f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:37:45 compute-0 nova_compute[182092]: 2026-01-23 09:37:45.780 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:47 compute-0 nova_compute[182092]: 2026-01-23 09:37:47.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:50 compute-0 nova_compute[182092]: 2026-01-23 09:37:50.782 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:51 compute-0 podman[228987]: 2026-01-23 09:37:51.218279932 +0000 UTC m=+0.046861409 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:37:51 compute-0 podman[228988]: 2026-01-23 09:37:51.241202287 +0000 UTC m=+0.067580517 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:37:52 compute-0 nova_compute[182092]: 2026-01-23 09:37:52.464 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:55 compute-0 nova_compute[182092]: 2026-01-23 09:37:55.784 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:37:57 compute-0 nova_compute[182092]: 2026-01-23 09:37:57.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:00 compute-0 nova_compute[182092]: 2026-01-23 09:38:00.786 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:02 compute-0 podman[229027]: 2026-01-23 09:38:02.2058313 +0000 UTC m=+0.036023802 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:38:02 compute-0 podman[229026]: 2026-01-23 09:38:02.205966977 +0000 UTC m=+0.038813883 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:38:02 compute-0 nova_compute[182092]: 2026-01-23 09:38:02.468 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:05 compute-0 podman[229063]: 2026-01-23 09:38:05.20018671 +0000 UTC m=+0.039459531 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 23 09:38:05 compute-0 nova_compute[182092]: 2026-01-23 09:38:05.788 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:07 compute-0 nova_compute[182092]: 2026-01-23 09:38:07.469 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:10 compute-0 nova_compute[182092]: 2026-01-23 09:38:10.108 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:10 compute-0 nova_compute[182092]: 2026-01-23 09:38:10.678 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:10 compute-0 nova_compute[182092]: 2026-01-23 09:38:10.790 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:11 compute-0 nova_compute[182092]: 2026-01-23 09:38:11.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:12 compute-0 nova_compute[182092]: 2026-01-23 09:38:12.471 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:12 compute-0 nova_compute[182092]: 2026-01-23 09:38:12.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.455 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.455 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.468 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.591 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.591 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.596 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.596 182096 INFO nova.compute.claims [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.648 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.664 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.711 182096 DEBUG nova.scheduler.client.report [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.791 182096 DEBUG nova.scheduler.client.report [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.791 182096 DEBUG nova.compute.provider_tree [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.807 182096 DEBUG nova.scheduler.client.report [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.826 182096 DEBUG nova.scheduler.client.report [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.863 182096 DEBUG nova.compute.provider_tree [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.874 182096 DEBUG nova.scheduler.client.report [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.887 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.888 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.925 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.925 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.936 182096 INFO nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:38:13 compute-0 nova_compute[182092]: 2026-01-23 09:38:13.946 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.024 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.024 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.025 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.025 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.027 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.028 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.028 182096 INFO nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Creating image(s)
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.029 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.029 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.030 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.040 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.086 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.086 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.087 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.096 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.113 182096 DEBUG nova.policy [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.141 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.142 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.165 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.166 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.167 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.212 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.213 182096 DEBUG nova.virt.disk.api [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.214 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:14 compute-0 podman[229087]: 2026-01-23 09:38:14.225266535 +0000 UTC m=+0.060824580 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.262 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.263 182096 DEBUG nova.virt.disk.api [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.263 182096 DEBUG nova.objects.instance [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.275 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.275 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Ensure instance console log exists: /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.276 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.277 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.278 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.912 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:14.912 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:38:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:14.914 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:38:14 compute-0 nova_compute[182092]: 2026-01-23 09:38:14.982 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Successfully created port: f53d4981-e71b-4854-a70b-97a0fa1ff783 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.556 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Successfully updated port: f53d4981-e71b-4854-a70b-97a0fa1ff783 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.574 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.574 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.574 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.588 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.599 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.599 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.599 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.599 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.639 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.640 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.640 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.640 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.642 182096 DEBUG nova.compute.manager [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.642 182096 DEBUG nova.compute.manager [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing instance network info cache due to event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.643 182096 DEBUG oslo_concurrency.lockutils [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.675 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.699 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.721 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.722 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.770 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.791 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.975 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.976 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=73.18387603759766GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.976 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:15 compute-0 nova_compute[182092]: 2026-01-23 09:38:15.977 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance bd78986f-3b06-4345-8540-cb6ab9009fb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.073 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.085 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.100 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:38:16 compute-0 nova_compute[182092]: 2026-01-23 09:38:16.101 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:17 compute-0 nova_compute[182092]: 2026-01-23 09:38:17.151 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:17 compute-0 nova_compute[182092]: 2026-01-23 09:38:17.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.530 182096 DEBUG nova.network.neutron [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updating instance_info_cache with network_info: [{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.552 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.552 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Instance network_info: |[{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.553 182096 DEBUG oslo_concurrency.lockutils [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.553 182096 DEBUG nova.network.neutron [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.555 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Start _get_guest_xml network_info=[{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.558 182096 WARNING nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.563 182096 DEBUG nova.virt.libvirt.host [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.563 182096 DEBUG nova.virt.libvirt.host [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.568 182096 DEBUG nova.virt.libvirt.host [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.568 182096 DEBUG nova.virt.libvirt.host [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.569 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.569 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.569 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.570 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.570 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.570 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.570 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.570 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.571 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.571 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.571 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.571 182096 DEBUG nova.virt.hardware [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.573 182096 DEBUG nova.virt.libvirt.vif [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:38:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-753355613',display_name='tempest-TestGettingAddress-server-753355613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-753355613',id=152,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFoxNOAzqFCK7fj1o2x808jK01jGkXcWkrcqWotCNNhhlkVBGqS/u6QEWituFz8WrIBU/cAaZkNNGKU6l/yRl4WZnY8Nb5mWIq2L2/ttFdkGQmDOObJaGnFJc76xtwD6g==',key_name='tempest-TestGettingAddress-1792278688',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-6d91vue7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:13Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=2b19c9a2-669e-45b6-b7a8-c0472c77d8a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", 
"dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.574 182096 DEBUG nova.network.os_vif_util [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.574 182096 DEBUG nova.network.os_vif_util [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.575 182096 DEBUG nova.objects.instance [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.587 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <uuid>2b19c9a2-669e-45b6-b7a8-c0472c77d8a0</uuid>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <name>instance-00000098</name>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-753355613</nova:name>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:38:18</nova:creationTime>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         <nova:port uuid="f53d4981-e71b-4854-a70b-97a0fa1ff783">
Jan 23 09:38:18 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe1e:f1ed" ipVersion="6"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <system>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="serial">2b19c9a2-669e-45b6-b7a8-c0472c77d8a0</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="uuid">2b19c9a2-669e-45b6-b7a8-c0472c77d8a0</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </system>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <os>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </os>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <features>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </features>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.config"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:1e:f1:ed"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <target dev="tapf53d4981-e7"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/console.log" append="off"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <video>
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </video>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:38:18 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:38:18 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:38:18 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:38:18 compute-0 nova_compute[182092]: </domain>
Jan 23 09:38:18 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.588 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Preparing to wait for external event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.588 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.588 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.588 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.589 182096 DEBUG nova.virt.libvirt.vif [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:38:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-753355613',display_name='tempest-TestGettingAddress-server-753355613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-753355613',id=152,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFoxNOAzqFCK7fj1o2x808jK01jGkXcWkrcqWotCNNhhlkVBGqS/u6QEWituFz8WrIBU/cAaZkNNGKU6l/yRl4WZnY8Nb5mWIq2L2/ttFdkGQmDOObJaGnFJc76xtwD6g==',key_name='tempest-TestGettingAddress-1792278688',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-6d91vue7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:13Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=2b19c9a2-669e-45b6-b7a8-c0472c77d8a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.589 182096 DEBUG nova.network.os_vif_util [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.590 182096 DEBUG nova.network.os_vif_util [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.590 182096 DEBUG os_vif [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.590 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.590 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.591 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.593 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf53d4981-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.593 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf53d4981-e7, col_values=(('external_ids', {'iface-id': 'f53d4981-e71b-4854-a70b-97a0fa1ff783', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:f1:ed', 'vm-uuid': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.594 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:18 compute-0 NetworkManager[54920]: <info>  [1769161098.5954] manager: (tapf53d4981-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.596 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.600 182096 INFO os_vif [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7')
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.629 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.629 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.629 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:1e:f1:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.630 182096 INFO nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Using config drive
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:18 compute-0 nova_compute[182092]: 2026-01-23 09:38:18.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:38:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:18.916 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.519 182096 INFO nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Creating config drive at /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.config
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.523 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6tvh2r_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.642 182096 DEBUG oslo_concurrency.processutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6tvh2r_" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:38:19 compute-0 kernel: tapf53d4981-e7: entered promiscuous mode
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.6843] manager: (tapf53d4981-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Jan 23 09:38:19 compute-0 ovn_controller[94697]: 2026-01-23T09:38:19Z|00599|binding|INFO|Claiming lport f53d4981-e71b-4854-a70b-97a0fa1ff783 for this chassis.
Jan 23 09:38:19 compute-0 ovn_controller[94697]: 2026-01-23T09:38:19Z|00600|binding|INFO|f53d4981-e71b-4854-a70b-97a0fa1ff783: Claiming fa:16:3e:1e:f1:ed 10.100.0.6 2001:db8::f816:3eff:fe1e:f1ed
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.687 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.691 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:f1:ed 10.100.0.6 2001:db8::f816:3eff:fe1e:f1ed'], port_security=['fa:16:3e:1e:f1:ed 10.100.0.6 2001:db8::f816:3eff:fe1e:f1ed'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fe1e:f1ed/64', 'neutron:device_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '913ed349-7c9c-4761-b374-2b010caacf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87bb1d0c-5b11-4fd3-82ca-b36019be3a6b, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=f53d4981-e71b-4854-a70b-97a0fa1ff783) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.691 103978 INFO neutron.agent.ovn.metadata.agent [-] Port f53d4981-e71b-4854-a70b-97a0fa1ff783 in datapath 8e5585ef-d2b9-42d4-8407-0e03aa280771 bound to our chassis
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.693 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e5585ef-d2b9-42d4-8407-0e03aa280771
Jan 23 09:38:19 compute-0 ovn_controller[94697]: 2026-01-23T09:38:19Z|00601|binding|INFO|Setting lport f53d4981-e71b-4854-a70b-97a0fa1ff783 ovn-installed in OVS
Jan 23 09:38:19 compute-0 ovn_controller[94697]: 2026-01-23T09:38:19Z|00602|binding|INFO|Setting lport f53d4981-e71b-4854-a70b-97a0fa1ff783 up in Southbound
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.699 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.702 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[00766a34-1361-496d-b647-c04f1dcbe9cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.703 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e5585ef-d1 in ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.704 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e5585ef-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.704 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4eaa24-7b41-4338-960a-044f0689b63f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.705 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[17ba8f62-7b6b-4d68-bc10-b6f80c350ff2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.714 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b8540a50-704f-49df-9e92-c1867b1cabd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 systemd-udevd[229146]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.7254] device (tapf53d4981-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:38:19 compute-0 systemd-machined[153562]: New machine qemu-77-instance-00000098.
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.7262] device (tapf53d4981-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.733 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c88da2cb-bb62-4beb-9da3-c034634e4770]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-00000098.
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.754 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2f387507-2ed8-4bf5-a89d-eeec7a286e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.758 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25a01d39-f41d-482e-8dae-76f7cf1b87f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.7595] manager: (tap8e5585ef-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Jan 23 09:38:19 compute-0 systemd-udevd[229150]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.781 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[09f21798-1128-454c-8d04-873d8d901051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.784 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e11e7897-a4a6-4c10-9905-07b12993482a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.8008] device (tap8e5585ef-d0): carrier: link connected
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.804 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[50bb59aa-05c0-47d1-9e76-27fe8b5dd530]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.817 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bccfbc9d-a309-4c28-8039-00e1a3e7ddc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e5585ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:28:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460528, 'reachable_time': 34576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229170, 'error': None, 'target': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.828 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe861273-b7da-4403-8887-8609f649b372]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:283d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460528, 'tstamp': 460528}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229171, 'error': None, 'target': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.840 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[553f4894-9b4b-411e-8c03-e567d228f0e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e5585ef-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:28:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460528, 'reachable_time': 34576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229172, 'error': None, 'target': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.862 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0001fa-e91a-4d85-99e5-5a09b4d00e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.866 182096 DEBUG nova.compute.manager [req-76ef2f02-5f67-4d74-8fef-cb5a96325418 req-654a2556-9953-4d78-bac7-8cdb3330e050 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.866 182096 DEBUG oslo_concurrency.lockutils [req-76ef2f02-5f67-4d74-8fef-cb5a96325418 req-654a2556-9953-4d78-bac7-8cdb3330e050 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.867 182096 DEBUG oslo_concurrency.lockutils [req-76ef2f02-5f67-4d74-8fef-cb5a96325418 req-654a2556-9953-4d78-bac7-8cdb3330e050 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.867 182096 DEBUG oslo_concurrency.lockutils [req-76ef2f02-5f67-4d74-8fef-cb5a96325418 req-654a2556-9953-4d78-bac7-8cdb3330e050 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.867 182096 DEBUG nova.compute.manager [req-76ef2f02-5f67-4d74-8fef-cb5a96325418 req-654a2556-9953-4d78-bac7-8cdb3330e050 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Processing event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.910 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aac89412-e448-4935-9dac-d6f85e5c9ef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.910 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e5585ef-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.911 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.911 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e5585ef-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:19 compute-0 NetworkManager[54920]: <info>  [1769161099.9130] manager: (tap8e5585ef-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 23 09:38:19 compute-0 kernel: tap8e5585ef-d0: entered promiscuous mode
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.913 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.915 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e5585ef-d0, col_values=(('external_ids', {'iface-id': 'c684d440-d284-41ed-a1e0-1be5ac0761c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:19 compute-0 ovn_controller[94697]: 2026-01-23T09:38:19Z|00603|binding|INFO|Releasing lport c684d440-d284-41ed-a1e0-1be5ac0761c9 from this chassis (sb_readonly=0)
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.916 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.929 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.929 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e5585ef-d2b9-42d4-8407-0e03aa280771.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e5585ef-d2b9-42d4-8407-0e03aa280771.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.932 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[69619d4f-9f70-4879-8183-5620e517a013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.933 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-8e5585ef-d2b9-42d4-8407-0e03aa280771
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/8e5585ef-d2b9-42d4-8407-0e03aa280771.pid.haproxy
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 8e5585ef-d2b9-42d4-8407-0e03aa280771
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:38:19 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:19.934 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'env', 'PROCESS_TAG=haproxy-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e5585ef-d2b9-42d4-8407-0e03aa280771.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.984 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.984 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161099.9845014, 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.985 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] VM Started (Lifecycle Event)
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.988 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.990 182096 INFO nova.virt.libvirt.driver [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Instance spawned successfully.
Jan 23 09:38:19 compute-0 nova_compute[182092]: 2026-01-23 09:38:19.991 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.002 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.006 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.009 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.009 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.010 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.010 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.011 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.011 182096 DEBUG nova.virt.libvirt.driver [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.030 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.030 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161099.984554, 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.030 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] VM Paused (Lifecycle Event)
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.060 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.067 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161099.98816, 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.067 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] VM Resumed (Lifecycle Event)
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.083 182096 INFO nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Took 6.06 seconds to spawn the instance on the hypervisor.
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.083 182096 DEBUG nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.088 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.090 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.116 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.149 182096 INFO nova.compute.manager [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Took 6.63 seconds to build instance.
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.165 182096 DEBUG oslo_concurrency.lockutils [None req-3d471e5d-46af-4ade-b451-a99355a1277d 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:20 compute-0 podman[229207]: 2026-01-23 09:38:20.220670786 +0000 UTC m=+0.035210269 container create b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 09:38:20 compute-0 systemd[1]: Started libpod-conmon-b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3.scope.
Jan 23 09:38:20 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:38:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c36dc3244506ddf4ba12afea276431193c432599ed3267a439e379fa1a42869b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:38:20 compute-0 podman[229207]: 2026-01-23 09:38:20.272977711 +0000 UTC m=+0.087517215 container init b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:38:20 compute-0 podman[229207]: 2026-01-23 09:38:20.277173053 +0000 UTC m=+0.091712536 container start b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 09:38:20 compute-0 podman[229207]: 2026-01-23 09:38:20.203383667 +0000 UTC m=+0.017923170 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:38:20 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [NOTICE]   (229223) : New worker (229225) forked
Jan 23 09:38:20 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [NOTICE]   (229223) : Loading success.
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.366 182096 DEBUG nova.network.neutron [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updated VIF entry in instance network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.369 182096 DEBUG nova.network.neutron [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updating instance_info_cache with network_info: [{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.389 182096 DEBUG oslo_concurrency.lockutils [req-429e4029-a2bc-4567-9bee-bad9733bd4a1 req-6cc83963-cd95-4080-9866-a8147a4d6320 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:38:20 compute-0 ovn_controller[94697]: 2026-01-23T09:38:20Z|00604|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:38:20 compute-0 ovn_controller[94697]: 2026-01-23T09:38:20Z|00605|binding|INFO|Releasing lport c684d440-d284-41ed-a1e0-1be5ac0761c9 from this chassis (sb_readonly=0)
Jan 23 09:38:20 compute-0 nova_compute[182092]: 2026-01-23 09:38:20.552 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.949 182096 DEBUG nova.compute.manager [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.949 182096 DEBUG oslo_concurrency.lockutils [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.950 182096 DEBUG oslo_concurrency.lockutils [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.950 182096 DEBUG oslo_concurrency.lockutils [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.950 182096 DEBUG nova.compute.manager [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] No waiting events found dispatching network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:38:21 compute-0 nova_compute[182092]: 2026-01-23 09:38:21.950 182096 WARNING nova.compute.manager [req-d1543aaf-692c-4056-8872-c5301c7e62bd req-e5ad8c35-34d4-4e21-a11c-0ab970f5a463 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received unexpected event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 for instance with vm_state active and task_state None.
Jan 23 09:38:22 compute-0 podman[229231]: 2026-01-23 09:38:22.21190438 +0000 UTC m=+0.046489979 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:38:22 compute-0 podman[229230]: 2026-01-23 09:38:22.221194739 +0000 UTC m=+0.056304726 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.483 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.592 182096 DEBUG nova.compute.manager [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.593 182096 DEBUG nova.compute.manager [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing instance network info cache due to event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.593 182096 DEBUG oslo_concurrency.lockutils [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.593 182096 DEBUG oslo_concurrency.lockutils [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.594 182096 DEBUG nova.network.neutron [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:38:22 compute-0 nova_compute[182092]: 2026-01-23 09:38:22.661 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:23 compute-0 nova_compute[182092]: 2026-01-23 09:38:23.595 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:23 compute-0 nova_compute[182092]: 2026-01-23 09:38:23.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:24 compute-0 nova_compute[182092]: 2026-01-23 09:38:24.062 182096 DEBUG nova.network.neutron [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updated VIF entry in instance network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:38:24 compute-0 nova_compute[182092]: 2026-01-23 09:38:24.063 182096 DEBUG nova.network.neutron [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updating instance_info_cache with network_info: [{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:24 compute-0 nova_compute[182092]: 2026-01-23 09:38:24.079 182096 DEBUG oslo_concurrency.lockutils [req-6697b657-97a4-4658-8f58-99ec41f9b182 req-d09aa178-cdf9-4c9c-8551-3767246b80b8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:38:27 compute-0 nova_compute[182092]: 2026-01-23 09:38:27.486 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:27 compute-0 nova_compute[182092]: 2026-01-23 09:38:27.751 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:28 compute-0 nova_compute[182092]: 2026-01-23 09:38:28.596 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:29 compute-0 nova_compute[182092]: 2026-01-23 09:38:29.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:29 compute-0 nova_compute[182092]: 2026-01-23 09:38:29.658 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:38:29 compute-0 nova_compute[182092]: 2026-01-23 09:38:29.808 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:38:30 compute-0 ovn_controller[94697]: 2026-01-23T09:38:30Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:f1:ed 10.100.0.6
Jan 23 09:38:30 compute-0 ovn_controller[94697]: 2026-01-23T09:38:30Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:f1:ed 10.100.0.6
Jan 23 09:38:32 compute-0 nova_compute[182092]: 2026-01-23 09:38:32.487 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.002 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'name': 'tempest-TestGettingAddress-server-753355613', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000098', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd4181f6c647942e881af13381cc2f253', 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'hostId': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.004 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'name': 'tempest-TestNetworkBasicOps-server-449213939', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000093', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'user_id': '8aa2911d0bc0474cb77214528548d308', 'hostId': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.006 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 / tapf53d4981-e7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.006 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.incoming.bytes volume: 1766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.007 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bd78986f-3b06-4345-8540-cb6ab9009fb2 / tap647c3660-19 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.007 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.incoming.bytes volume: 4683 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5dc0c0f-c65d-4759-8a08-ce2c311edcf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1766, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.004462', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '486aaf50-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '3832abc6156396789f356b75f9df26ee509d0fd3c7daefcce27c1c52eed4f2fd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4683, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.004462', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '486af096-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '3377e6eff49c5710a8a18be552fd02d01a59cf86c187f582dac848ea5a69ae45'}]}, 'timestamp': '2026-01-23 09:38:33.008254', '_unique_id': '2f755197021f4832b11eae372b8873fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.009 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.026 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.requests volume: 311 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.026 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.045 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.requests volume: 339 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.045 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68b4bd4f-136e-4f07-b765-77583c6d279c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 311, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.010156', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '486dcb2c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '545c8742c6ab5348e0d66a24722f6162c646ad738e4db1209c4e301b7e10f7d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.010156', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '486dd70c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '6796526688bcf160265f6b60d96af39162a5eb87493b15a5fe17c09a08ed1e22'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 339, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.010156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4870acd4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '50897eefafbf41ccbdd5f86ea5f16568c8fb67d9cbd35e9316c2678149ca0d28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.010156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4870b81e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '5c6ae48fce0ad6db8602786a7599f3b69b85a5fff0f377e43c4964225152ffbe'}]}, 'timestamp': '2026-01-23 09:38:33.046116', '_unique_id': '32de18e031234092b6ddfb727bbf46c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.047 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.047 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.048 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.bytes volume: 73093120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.048 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d9b9efd-9488-4a6e-a6a9-97c86cacf56f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.047678', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4870ff7c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': 'b86fe5703adac06c603a45733fa1ff6487f5ffcb5ddd8cbf24ae733d1bf088db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.047678', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487108a0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': 'e3edc1f62875b35d4178673844f23d46c0b7cc6dc91d64e518a0e29dd22dbee2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73093120, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.047678', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4871114c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '7732f7d41c112d30386b396ce68b7b6a32ef8d32fb700917d8ad1a58b19099c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.047678', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48711aac-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': 'c3b28472b7466d25a4be0f9a8d23dcdc66062e0deea40afafafa2bd00412b4bc'}]}, 'timestamp': '2026-01-23 09:38:33.048634', '_unique_id': '4127b500083e47a7ba78c384ff92b418'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.050 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.050 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.incoming.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '863441b1-1512-49f1-bcb0-359be21573e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.050219', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '48716246-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '625fd314ac8b562bf1167dd015f5d03a5a8b04f9f1ebcd2e9600d7e588253271'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.050219', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '48717312-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '07e51fc9436f11144fde552560dbdf29a3a54616fc1cd0a8409f5e22ad9a2b7d'}]}, 'timestamp': '2026-01-23 09:38:33.050927', '_unique_id': '4d3fbf8dc56c479d9b49d3a37ed090f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.052 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.bytes volume: 30403072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.052 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.052 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.bytes volume: 29997568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50047879-38db-4f37-8fde-2854c5652c93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30403072, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.052312', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4871b3e0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '657bcde3690284b2eda5f44bab0f654c05e46b228c3e855db87fb0dd7edc4673'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.052312', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4871be3a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '05a0d4f9847bb2f9d71961b1484273470bda868a1c57a30d74f1a2eb0673a55d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29997568, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.052312', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4871c736-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': 'b34473dda7dc68f1073d0b4c04b81022df369105c147b59a7bde849e3e30a0cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.052312', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4871cfa6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '1dde8653d9017b37fa0a0864c5ec7cfd744077b9cc6236af7c29bd7e68bdd36b'}]}, 'timestamp': '2026-01-23 09:38:33.053265', '_unique_id': '9aa76b1e86a64507b3f664fb2683c695'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.054 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.054 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9356528-4771-488d-b7cc-222df7d76149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.054716', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '4872124a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '53121d56a12140211d7486506f4c3ffbcb3c2742d1eeecd84ff88da60b48caea'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.054716', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '48721ba0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '7d1368998d365a2ac67358d23a8dd01cd55dda13d78ba28234250aa4d9129325'}]}, 'timestamp': '2026-01-23 09:38:33.055218', '_unique_id': '26a01ef5bc83430e9333a8bf6ab3c9d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.056 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.latency volume: 187965939 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.056 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.latency volume: 65020548 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.057 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.latency volume: 177573435 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.057 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.latency volume: 93908125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b81c9425-fd9d-4439-8ac1-918d3074ef06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187965939, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.056604', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48725c46-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '5a3cf1da6fceaf891a887f1d595a29308e1b18add5a5a9ad4c2e54f1539d1eff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 65020548, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.056604', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48726542-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '541e69711e6dfe6f843419a7c0405c2bd49d8bd4677bd1a9465ee25660f271c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 177573435, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.056604', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48726df8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '29cda04e5d72e08fb1b770f9fdf39f9eb8d2a5cb499976415a2e6770c5d27ecc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 93908125, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.056604', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48727654-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': 'e25e0a70a290644e5e3b09fd741ef33cb5cf23e67744f8de33a18885d84d1239'}]}, 'timestamp': '2026-01-23 09:38:33.057545', '_unique_id': 'dceba90891374e498a5a94c9c289fb40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.059 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.059 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>]
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.065 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.066 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.072 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.072 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '012b0e42-1d3d-4893-be46-54ca2d51b0a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.059373', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4873c34c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '9a3c09666c25d48f73182d34f436f1561247bd88d593d7a0b7a024fa3b1c8105'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.059373', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4873ccf2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '0ab801fe5d3e46add9c5ac399d0a4e202e2694f5c0921018f50fcffd457e5fdc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.059373', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4874c576-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': '15a99c7d2c59e46ccd2b2933003c7f9a3aac3f8270289298038f967569dc2494'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.059373', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4874d002-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': '1795fa5471843acf110addf68415433ee681b3bae2866f8c3cb5dd6c8f69a27f'}]}, 'timestamp': '2026-01-23 09:38:33.072939', '_unique_id': 'f7aad20f2d2c41258f5f5c6f709615d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.074 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.074 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba728fa7-c4b4-429c-b79e-65eaab317175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.074394', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '487512f6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '3cb94a56e939807108d8fe401267bbabb463a026af57238a780bdc6825067f49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.074394', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '48751d50-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': 'ccfb7276e6c554b2332b38ce1c830d00b4104c4963494e2a11b2b0bd18740da8'}]}, 'timestamp': '2026-01-23 09:38:33.074923', '_unique_id': '079af26469a3444294eef94ade458a61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.075 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.076 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.076 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2678015e-96cc-4c7b-b86e-6c4e38e7ed9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.076304', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '48755d38-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '45481717e6899bf9be2bc155d92fd87ce44e2d9461ed9618765e13ab7bdca591'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.076304', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '4875683c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '015572bb2e324a1481bc4407816c465416d9301dab7d81e579a7e0ac2c1a7e08'}]}, 'timestamp': '2026-01-23 09:38:33.076842', '_unique_id': 'f685a1f864b84f8a9b2abd6f1eb83e3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.078 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.078 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb2e4616-8297-4799-96c5-5b495b54c6c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.078220', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '4875a7f2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '71a21200bfe855c1403bb6a0f2cca763650572a4ecd1530f6d6570f0fd3c5a07'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.078220', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '4875b1b6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': 'bbf665beebaae621ae94a9c4a53395e98bc5e4de6dc1dddba0360e5c58f8b970'}]}, 'timestamp': '2026-01-23 09:38:33.078739', '_unique_id': 'e46a02f67d2649d684a33a9b53f74104'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.080 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.089 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/cpu volume: 9560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.100 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/cpu volume: 10500000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07efcf1a-7e72-4e10-a33a-ae6d49b1c0aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9560000000, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'timestamp': '2026-01-23T09:38:33.080099', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '48777136-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.619239099, 'message_signature': '09cdeb0a66ff57b1728d67772bda68c883d5544678112c3cb897c3426071ed8d'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10500000000, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 
'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'timestamp': '2026-01-23T09:38:33.080099', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '487902f8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.629523462, 'message_signature': '5e2f5e1fef90a3e8940bdb5208fa81677c358c10575684741cdeb5392b3992bb'}]}, 'timestamp': '2026-01-23 09:38:33.100464', '_unique_id': '583c14940dbe45fba971f2281cbcc751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>]
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>]
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.102 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/memory.usage volume: 40.390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/memory.usage volume: 42.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c006896d-ea6c-4af7-af4e-f74535a7849f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.390625, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'timestamp': '2026-01-23T09:38:33.102764', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '48796662-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.619239099, 'message_signature': '30398993378e30aff57ac3f955b04913e20be76660b6a258c721f70049c50904'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.81640625, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 
'timestamp': '2026-01-23T09:38:33.102764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '48796f54-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.629523462, 'message_signature': '4089cc20fc263ebfcea62fff4809b0b0a6c9647db278286322da66c2c9eb9476'}]}, 'timestamp': '2026-01-23 09:38:33.103231', '_unique_id': '11d1f7296d324cb9b7f84f18e1d5a86b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.103 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.104 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.104 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-753355613>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-449213939>]
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.105 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.105 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.105 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.105 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f3f1af8-a468-4356-99f6-0675536105e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.105027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4879beb4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '8bbee65be58dcfb01ddb823f1e3956a5821286bcdf8f702a25b4db1bc382b904'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.105027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4879c7a6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '4d6cfe8003274626ec8b14f91f82939ef1fd6ac6da867873da739a9a09c03ffa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.105027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4879d084-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': 'b930d391c4c755e4ee91ff00d8e08b3ab8aded8847a36ffd062472f441512c49'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.105027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4879d9bc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': 'a8ff79b3716614836ccc7b997cc2d13b2b4233d9c0bf58ae1ecbb80599e15c55'}]}, 'timestamp': '2026-01-23 09:38:33.105953', '_unique_id': '8e6ffc5f9f08484d8f314df9cbe66e57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.106 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.107 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.107 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.107 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae560981-30b0-4a9d-8abb-a6a875e30af5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.107367', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487a1ac6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '5325ad9a639f9cd45184b0e175a791a4995e70e9e72138b7ff2e8071047878fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.107367', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487a2552-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '75d910cc57e66d9780179219465d1ee310d806ca5f963fb4975b5197b392e6a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1075, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.107367', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487a2df4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '8915d55a493939edb9a651af65f3537ecf696dfbcb67db6a48eb5f73493e7171'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.107367', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487a3704-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': 'a4b5636f728d27eae4ca3f253efa75c5cd9876d4e63a24dd602efaf07a93e1c7'}]}, 'timestamp': '2026-01-23 09:38:33.108342', '_unique_id': 'bfa69094920f414cbab6a82efb253c4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.108 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.109 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc25b1bd-5539-4745-aaea-54e03ffcc48e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.109788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '487a7958-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': 'a6df08d7c318276e8bba4b5e620b056a0f0b10a433bda01c1974cc6b1ac76d3c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.109788', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '487a82c2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': 'd26469cb4cf175d871ac88088ed15a1d518782fedfd16d20de39b7b26901c887'}]}, 'timestamp': '2026-01-23 09:38:33.110286', '_unique_id': '18f5693070a94cdd8308de0525047ef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.110 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.111 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.outgoing.bytes volume: 3488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93dd1894-9c15-4963-b801-c079e396eafc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.111790', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '487ac746-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': 'f505cae6dbca2da407424a38f8a7c89d7517b6f677ab1e553dc3463f06e8fa89'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3488, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.111790', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '487ad092-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '533bd39d12c8c01c359ea84c12e71f0e9919c4d3fd671959b6bcb84e63511c55'}]}, 'timestamp': '2026-01-23 09:38:33.112280', '_unique_id': 'af501cc2ba204beba83bd2c3b346cdbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.112 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.113 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.latency volume: 454580593 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.113 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.114 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.latency volume: 386683895 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.114 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc97869c-1928-4232-aa1a-d8395b760cac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 454580593, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.113668', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487b10c0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': '3fcda325efd20cb1c5b360c49632cbcf86bff1b14f9b84f8fa5da2dc9447c930'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.113668', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487b19bc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.539540799, 'message_signature': 'e50a8f4ae18dc797a2b873ef68cdf48709eee4d66f17e10dfc95bdb5b6d34953'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 386683895, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.113668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487b224a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': '5d93c5b313c4a42e1fcaab36bb21db396e5f8cd6cfa3ba4f5ab8aeb2dd7476c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.113668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487b2aa6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.556603695, 'message_signature': 'a5bda82c26a5f28d7b64e4645b58b2c5c0aad13e923b884a8be66e3c00fd233b'}]}, 'timestamp': '2026-01-23 09:38:33.114585', '_unique_id': 'b043ae6c8dd0460eb64274a1654a4e6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.116 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.116 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.116 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.116 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3268df8d-0d93-45ef-9fe7-50ac5673ac58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-vda', 'timestamp': '2026-01-23T09:38:33.116011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487b6c14-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '7e6859e2d78d5d99df26efb3d239190fc8d73c7c4bc2ff0c0b71609e97c738a3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-sda', 'timestamp': '2026-01-23T09:38:33.116011', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'instance-00000098', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487b74fc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.588753692, 'message_signature': '566f47b21608c61a8737d97ac66fb361c5cf92f34ed7656271c6a336e294ebe5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-vda', 'timestamp': '2026-01-23T09:38:33.116011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 
'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '487b7dee-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': '877e0352969be7946d583f4c0efd4377e5a9759f273c1ba9a70b1c2b18ee1785'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2-sda', 'timestamp': '2026-01-23T09:38:33.116011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'instance-00000093', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '487b873a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.595668099, 'message_signature': '9bc752060ee348e60b75887fd63027f7fe5f5bdbc479e2b469973f749d6995da'}]}, 'timestamp': '2026-01-23 09:38:33.116946', '_unique_id': '74f47020ef8b47f994c5d568074a9ce6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.117 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.118 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.118 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c437cbb-5196-423f-a3bb-28e9c837e9d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.118342', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '487bc7a4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': 'ddf5d29ab1620aa74b0a0c0484f398aa863cf77590ae2430bfea45d77f6bde31'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.118342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '487bd258-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': 'ae0758890b9f9f2480b6b96b177cbbebec68cf7341b25cb155e46cd3aa05b034'}]}, 'timestamp': '2026-01-23 09:38:33.118879', '_unique_id': '8a36973ac963424185c4ac4c8d1ea4ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.119 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.120 12 DEBUG ceilometer.compute.pollsters [-] 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.120 12 DEBUG ceilometer.compute.pollsters [-] bd78986f-3b06-4345-8540-cb6ab9009fb2/network.outgoing.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcb583c2-f4d3-412d-8bd1-30300c094dd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-00000098-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-tapf53d4981-e7', 'timestamp': '2026-01-23T09:38:33.120280', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-753355613', 'name': 'tapf53d4981-e7', 'instance_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:f1:ed', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf53d4981-e7'}, 'message_id': '487c12d6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.533855538, 'message_signature': '17326ec99813d3fe268710772e8454a0080841ace5d2b536c9a143de91e86729'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': 
'8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-00000093-bd78986f-3b06-4345-8540-cb6ab9009fb2-tap647c3660-19', 'timestamp': '2026-01-23T09:38:33.120280', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-449213939', 'name': 'tap647c3660-19', 'instance_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:49:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap647c3660-19'}, 'message_id': '487c1d3a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4618.535988961, 'message_signature': '3a18b1555900bdafb91a96111f98d967e1ce9a68872c9b9e6111cf1a8876cf9a'}]}, 'timestamp': '2026-01-23 09:38:33.120796', '_unique_id': 'da70028d9b3c4b40b491b2b83e00af59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:38:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:38:33.121 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:38:33 compute-0 podman[229280]: 2026-01-23 09:38:33.207341392 +0000 UTC m=+0.042507929 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 23 09:38:33 compute-0 podman[229281]: 2026-01-23 09:38:33.20744649 +0000 UTC m=+0.041469000 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:38:33 compute-0 nova_compute[182092]: 2026-01-23 09:38:33.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:36 compute-0 podman[229317]: 2026-01-23 09:38:36.209219807 +0000 UTC m=+0.040309614 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7)
Jan 23 09:38:37 compute-0 nova_compute[182092]: 2026-01-23 09:38:37.489 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:38 compute-0 nova_compute[182092]: 2026-01-23 09:38:38.600 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.848 182096 DEBUG nova.compute.manager [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.849 182096 DEBUG nova.compute.manager [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing instance network info cache due to event network-changed-f53d4981-e71b-4854-a70b-97a0fa1ff783. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.849 182096 DEBUG oslo_concurrency.lockutils [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.849 182096 DEBUG oslo_concurrency.lockutils [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.849 182096 DEBUG nova.network.neutron [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Refreshing network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:38:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:39.870 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:39.870 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.939 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.939 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.939 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.940 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.940 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.948 182096 INFO nova.compute.manager [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Terminating instance
Jan 23 09:38:39 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.956 182096 DEBUG nova.compute.manager [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:38:39 compute-0 kernel: tapf53d4981-e7 (unregistering): left promiscuous mode
Jan 23 09:38:39 compute-0 NetworkManager[54920]: <info>  [1769161119.9822] device (tapf53d4981-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:38:39 compute-0 ovn_controller[94697]: 2026-01-23T09:38:39Z|00606|binding|INFO|Releasing lport f53d4981-e71b-4854-a70b-97a0fa1ff783 from this chassis (sb_readonly=0)
Jan 23 09:38:39 compute-0 ovn_controller[94697]: 2026-01-23T09:38:39Z|00607|binding|INFO|Setting lport f53d4981-e71b-4854-a70b-97a0fa1ff783 down in Southbound
Jan 23 09:38:39 compute-0 ovn_controller[94697]: 2026-01-23T09:38:39Z|00608|binding|INFO|Removing iface tapf53d4981-e7 ovn-installed in OVS
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.997 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:39.999 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.005 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:f1:ed 10.100.0.6 2001:db8::f816:3eff:fe1e:f1ed'], port_security=['fa:16:3e:1e:f1:ed 10.100.0.6 2001:db8::f816:3eff:fe1e:f1ed'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8::f816:3eff:fe1e:f1ed/64', 'neutron:device_id': '2b19c9a2-669e-45b6-b7a8-c0472c77d8a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '913ed349-7c9c-4761-b374-2b010caacf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87bb1d0c-5b11-4fd3-82ca-b36019be3a6b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=f53d4981-e71b-4854-a70b-97a0fa1ff783) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.005 103978 INFO neutron.agent.ovn.metadata.agent [-] Port f53d4981-e71b-4854-a70b-97a0fa1ff783 in datapath 8e5585ef-d2b9-42d4-8407-0e03aa280771 unbound from our chassis
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.007 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e5585ef-d2b9-42d4-8407-0e03aa280771, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.009 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9014f8b5-f0af-4e2e-82c1-ae8a0f2f3129]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.009 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771 namespace which is not needed anymore
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.012 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 23 09:38:40 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000098.scope: Consumed 10.527s CPU time.
Jan 23 09:38:40 compute-0 systemd-machined[153562]: Machine qemu-77-instance-00000098 terminated.
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [NOTICE]   (229223) : haproxy version is 2.8.14-c23fe91
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [NOTICE]   (229223) : path to executable is /usr/sbin/haproxy
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [WARNING]  (229223) : Exiting Master process...
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [WARNING]  (229223) : Exiting Master process...
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [ALERT]    (229223) : Current worker (229225) exited with code 143 (Terminated)
Jan 23 09:38:40 compute-0 neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771[229219]: [WARNING]  (229223) : All workers exited. Exiting... (0)
Jan 23 09:38:40 compute-0 systemd[1]: libpod-b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3.scope: Deactivated successfully.
Jan 23 09:38:40 compute-0 podman[229357]: 2026-01-23 09:38:40.120998769 +0000 UTC m=+0.038004406 container died b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:38:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-c36dc3244506ddf4ba12afea276431193c432599ed3267a439e379fa1a42869b-merged.mount: Deactivated successfully.
Jan 23 09:38:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3-userdata-shm.mount: Deactivated successfully.
Jan 23 09:38:40 compute-0 podman[229357]: 2026-01-23 09:38:40.141155388 +0000 UTC m=+0.058161027 container cleanup b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 09:38:40 compute-0 systemd[1]: libpod-conmon-b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3.scope: Deactivated successfully.
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.176 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 podman[229380]: 2026-01-23 09:38:40.207990411 +0000 UTC m=+0.049060956 container remove b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.210 182096 INFO nova.virt.libvirt.driver [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Instance destroyed successfully.
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.210 182096 DEBUG nova.objects.instance [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.215 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[77d1c257-a655-4df0-b67a-290aa30f8bdd]: (4, ('Fri Jan 23 09:38:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771 (b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3)\nb38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3\nFri Jan 23 09:38:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771 (b38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3)\nb38f3737bcc5dd535a25d6a34e4e091887d124c087506a3e3810ff2525ee59d3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.216 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[805ec241-2eff-4e18-9c6d-746bc05a8f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.217 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e5585ef-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.218 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 kernel: tap8e5585ef-d0: left promiscuous mode
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.225 182096 DEBUG nova.virt.libvirt.vif [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:38:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-753355613',display_name='tempest-TestGettingAddress-server-753355613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-753355613',id=152,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCFoxNOAzqFCK7fj1o2x808jK01jGkXcWkrcqWotCNNhhlkVBGqS/u6QEWituFz8WrIBU/cAaZkNNGKU6l/yRl4WZnY8Nb5mWIq2L2/ttFdkGQmDOObJaGnFJc76xtwD6g==',key_name='tempest-TestGettingAddress-1792278688',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-6d91vue7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:38:20Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=2b19c9a2-669e-45b6-b7a8-c0472c77d8a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.226 182096 DEBUG nova.network.os_vif_util [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.226 182096 DEBUG nova.network.os_vif_util [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.227 182096 DEBUG os_vif [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.228 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.228 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf53d4981-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.230 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.236 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.239 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1abdeb04-8b17-43d1-824a-84a6bab7600d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.240 182096 INFO os_vif [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:f1:ed,bridge_name='br-int',has_traffic_filtering=True,id=f53d4981-e71b-4854-a70b-97a0fa1ff783,network=Network(8e5585ef-d2b9-42d4-8407-0e03aa280771),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d4981-e7')
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.241 182096 INFO nova.virt.libvirt.driver [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Deleting instance files /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0_del
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.241 182096 INFO nova.virt.libvirt.driver [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Deletion of /var/lib/nova/instances/2b19c9a2-669e-45b6-b7a8-c0472c77d8a0_del complete
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.251 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[367ae99a-8812-42de-9287-39146c8cd89b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.252 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5373b28b-d438-40ae-a6aa-521c4dc530f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.268 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[714a693d-43ee-4119-9513-2a89257c0c15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460523, 'reachable_time': 38865, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229408, 'error': None, 'target': 'ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d8e5585ef\x2dd2b9\x2d42d4\x2d8407\x2d0e03aa280771.mount: Deactivated successfully.
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.272 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e5585ef-d2b9-42d4-8407-0e03aa280771 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:38:40 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:40.272 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc7eaed-ed5a-4dd1-b52f-1f21229f6591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.288 182096 INFO nova.compute.manager [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.289 182096 DEBUG oslo.service.loopingcall [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.289 182096 DEBUG nova.compute.manager [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.289 182096 DEBUG nova.network.neutron [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.891 182096 DEBUG nova.network.neutron [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.903 182096 INFO nova.compute.manager [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Took 0.61 seconds to deallocate network for instance.
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.974 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:40 compute-0 nova_compute[182092]: 2026-01-23 09:38:40.975 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.053 182096 DEBUG nova.compute.provider_tree [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.066 182096 DEBUG nova.scheduler.client.report [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.080 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.100 182096 INFO nova.scheduler.client.report [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.186 182096 DEBUG nova.network.neutron [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updated VIF entry in instance network info cache for port f53d4981-e71b-4854-a70b-97a0fa1ff783. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.186 182096 DEBUG nova.network.neutron [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Updating instance_info_cache with network_info: [{"id": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "address": "fa:16:3e:1e:f1:ed", "network": {"id": "8e5585ef-d2b9-42d4-8407-0e03aa280771", "bridge": "br-int", "label": "tempest-network-smoke--1903704812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe1e:f1ed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d4981-e7", "ovs_interfaceid": "f53d4981-e71b-4854-a70b-97a0fa1ff783", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.194 182096 DEBUG oslo_concurrency.lockutils [None req-18e69864-4812-4d2e-8544-4e53a69be534 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.203 182096 DEBUG oslo_concurrency.lockutils [req-91766275-6540-488c-963e-71eeb93cbc23 req-bafce146-b0f2-4d2a-a887-49b45100d5e2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-2b19c9a2-669e-45b6-b7a8-c0472c77d8a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.920 182096 DEBUG nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-vif-unplugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.921 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.921 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.921 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.921 182096 DEBUG nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] No waiting events found dispatching network-vif-unplugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.921 182096 WARNING nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received unexpected event network-vif-unplugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 for instance with vm_state deleted and task_state None.
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.922 182096 DEBUG nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.922 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.922 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.922 182096 DEBUG oslo_concurrency.lockutils [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "2b19c9a2-669e-45b6-b7a8-c0472c77d8a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.922 182096 DEBUG nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] No waiting events found dispatching network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.923 182096 WARNING nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received unexpected event network-vif-plugged-f53d4981-e71b-4854-a70b-97a0fa1ff783 for instance with vm_state deleted and task_state None.
Jan 23 09:38:41 compute-0 nova_compute[182092]: 2026-01-23 09:38:41.923 182096 DEBUG nova.compute.manager [req-9fb6e997-981d-4b5e-8773-d0f3952ec70f req-67517445-2f9b-464c-b2a9-fec8b1485f28 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Received event network-vif-deleted-f53d4981-e71b-4854-a70b-97a0fa1ff783 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:38:42 compute-0 nova_compute[182092]: 2026-01-23 09:38:42.490 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:45 compute-0 podman[229409]: 2026-01-23 09:38:45.222198493 +0000 UTC m=+0.060464091 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:38:45 compute-0 nova_compute[182092]: 2026-01-23 09:38:45.229 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:47 compute-0 ovn_controller[94697]: 2026-01-23T09:38:47Z|00609|binding|INFO|Releasing lport d65b2ee0-0456-4245-9d04-610259fcbfde from this chassis (sb_readonly=0)
Jan 23 09:38:47 compute-0 nova_compute[182092]: 2026-01-23 09:38:47.156 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:47 compute-0 nova_compute[182092]: 2026-01-23 09:38:47.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:50 compute-0 nova_compute[182092]: 2026-01-23 09:38:50.230 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:52 compute-0 nova_compute[182092]: 2026-01-23 09:38:52.493 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:53 compute-0 podman[229433]: 2026-01-23 09:38:53.219213885 +0000 UTC m=+0.056261215 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:38:53 compute-0 podman[229434]: 2026-01-23 09:38:53.231925607 +0000 UTC m=+0.066658521 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:38:55 compute-0 nova_compute[182092]: 2026-01-23 09:38:55.208 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161120.2076204, 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:38:55 compute-0 nova_compute[182092]: 2026-01-23 09:38:55.208 182096 INFO nova.compute.manager [-] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] VM Stopped (Lifecycle Event)
Jan 23 09:38:55 compute-0 nova_compute[182092]: 2026-01-23 09:38:55.223 182096 DEBUG nova.compute.manager [None req-5be561c4-5344-47cb-a71c-63d20480c678 - - - - - -] [instance: 2b19c9a2-669e-45b6-b7a8-c0472c77d8a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:38:55 compute-0 nova_compute[182092]: 2026-01-23 09:38:55.230 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:55 compute-0 nova_compute[182092]: 2026-01-23 09:38:55.832 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:56 compute-0 nova_compute[182092]: 2026-01-23 09:38:56.230 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:38:56 compute-0 nova_compute[182092]: 2026-01-23 09:38:56.248 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Triggering sync for uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 09:38:56 compute-0 nova_compute[182092]: 2026-01-23 09:38:56.249 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:38:56 compute-0 nova_compute[182092]: 2026-01-23 09:38:56.249 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:38:56 compute-0 nova_compute[182092]: 2026-01-23 09:38:56.263 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:38:57 compute-0 nova_compute[182092]: 2026-01-23 09:38:57.494 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:58 compute-0 nova_compute[182092]: 2026-01-23 09:38:58.609 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:59.392 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:38:59 compute-0 nova_compute[182092]: 2026-01-23 09:38:59.393 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:38:59 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:38:59.393 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:39:00 compute-0 nova_compute[182092]: 2026-01-23 09:39:00.231 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:00 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:00.395 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:00 compute-0 nova_compute[182092]: 2026-01-23 09:39:00.896 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:02 compute-0 nova_compute[182092]: 2026-01-23 09:39:02.495 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.134 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.134 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.149 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.217 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.217 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.221 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.221 182096 INFO nova.compute.claims [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:39:04 compute-0 podman[229474]: 2026-01-23 09:39:04.230079641 +0000 UTC m=+0.061864622 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 09:39:04 compute-0 podman[229475]: 2026-01-23 09:39:04.234621405 +0000 UTC m=+0.064084046 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.323 182096 DEBUG nova.compute.provider_tree [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.333 182096 DEBUG nova.scheduler.client.report [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.347 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.347 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.397 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.397 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.408 182096 INFO nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.421 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.523 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.524 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.524 182096 INFO nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Creating image(s)
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.524 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.525 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.525 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.535 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.579 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.580 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.581 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.590 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.604 182096 DEBUG nova.policy [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.634 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.635 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.656 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.657 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.657 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.705 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.706 182096 DEBUG nova.virt.disk.api [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.706 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.750 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.751 182096 DEBUG nova.virt.disk.api [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.751 182096 DEBUG nova.objects.instance [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid a7e87540-5713-4ac2-a9c8-942e11144ee8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.768 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.768 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Ensure instance console log exists: /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.769 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.769 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:04 compute-0 nova_compute[182092]: 2026-01-23 09:39:04.769 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:05 compute-0 nova_compute[182092]: 2026-01-23 09:39:05.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:05 compute-0 nova_compute[182092]: 2026-01-23 09:39:05.847 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Successfully created port: 2ffa52f2-8471-4c8c-ada7-a235d863feed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:39:06 compute-0 nova_compute[182092]: 2026-01-23 09:39:06.380 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Successfully created port: b3f8d54c-85ef-4fa7-8739-13144ea79fd4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:39:07 compute-0 podman[229527]: 2026-01-23 09:39:07.227683136 +0000 UTC m=+0.064329088 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.496 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.556 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Successfully updated port: 2ffa52f2-8471-4c8c-ada7-a235d863feed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.635 182096 DEBUG nova.compute.manager [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.636 182096 DEBUG nova.compute.manager [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing instance network info cache due to event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.636 182096 DEBUG oslo_concurrency.lockutils [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.636 182096 DEBUG oslo_concurrency.lockutils [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.636 182096 DEBUG nova.network.neutron [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing network info cache for port 2ffa52f2-8471-4c8c-ada7-a235d863feed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:07 compute-0 nova_compute[182092]: 2026-01-23 09:39:07.802 182096 DEBUG nova.network.neutron [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.090 182096 DEBUG nova.network.neutron [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.099 182096 DEBUG oslo_concurrency.lockutils [req-88f4b16d-ea01-4aab-9e74-e6228e1f44ef req-98daaf22-17b5-4a96-9010-1697ce4bc0ef 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.303 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Successfully updated port: b3f8d54c-85ef-4fa7-8739-13144ea79fd4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.318 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.318 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.318 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:39:08 compute-0 nova_compute[182092]: 2026-01-23 09:39:08.428 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.731 182096 DEBUG nova.compute.manager [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-changed-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.731 182096 DEBUG nova.compute.manager [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing instance network info cache due to event network-changed-b3f8d54c-85ef-4fa7-8739-13144ea79fd4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.731 182096 DEBUG oslo_concurrency.lockutils [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.820 182096 DEBUG nova.network.neutron [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.832 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.833 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance network_info: |[{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.833 182096 DEBUG oslo_concurrency.lockutils [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.833 182096 DEBUG nova.network.neutron [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing network info cache for port b3f8d54c-85ef-4fa7-8739-13144ea79fd4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.836 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Start _get_guest_xml network_info=[{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.839 182096 WARNING nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.845 182096 DEBUG nova.virt.libvirt.host [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.846 182096 DEBUG nova.virt.libvirt.host [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.848 182096 DEBUG nova.virt.libvirt.host [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.848 182096 DEBUG nova.virt.libvirt.host [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.849 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.849 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.850 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.850 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.850 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.850 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.850 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.851 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.851 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.851 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.851 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.851 182096 DEBUG nova.virt.hardware [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.854 182096 DEBUG nova.virt.libvirt.vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.854 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.855 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.855 182096 DEBUG nova.virt.libvirt.vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.856 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.856 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.857 182096 DEBUG nova.objects.instance [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7e87540-5713-4ac2-a9c8-942e11144ee8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.867 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <uuid>a7e87540-5713-4ac2-a9c8-942e11144ee8</uuid>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <name>instance-0000009b</name>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-939246726</nova:name>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:39:09</nova:creationTime>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:port uuid="2ffa52f2-8471-4c8c-ada7-a235d863feed">
Jan 23 09:39:09 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         <nova:port uuid="b3f8d54c-85ef-4fa7-8739-13144ea79fd4">
Jan 23 09:39:09 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fea8:729b" ipVersion="6"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <system>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="serial">a7e87540-5713-4ac2-a9c8-942e11144ee8</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="uuid">a7e87540-5713-4ac2-a9c8-942e11144ee8</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </system>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <os>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </os>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <features>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </features>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.config"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:52:70:58"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <target dev="tap2ffa52f2-84"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:a8:72:9b"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <target dev="tapb3f8d54c-85"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/console.log" append="off"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <video>
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </video>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:39:09 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:39:09 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:39:09 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:39:09 compute-0 nova_compute[182092]: </domain>
Jan 23 09:39:09 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Preparing to wait for external event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Preparing to wait for external event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.868 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.869 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.869 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.869 182096 DEBUG nova.virt.libvirt.vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.869 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.870 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.870 182096 DEBUG os_vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.870 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.871 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.871 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.873 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.873 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ffa52f2-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.873 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ffa52f2-84, col_values=(('external_ids', {'iface-id': '2ffa52f2-8471-4c8c-ada7-a235d863feed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:70:58', 'vm-uuid': 'a7e87540-5713-4ac2-a9c8-942e11144ee8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.874 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 NetworkManager[54920]: <info>  [1769161149.8760] manager: (tap2ffa52f2-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.876 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.879 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.879 182096 INFO os_vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84')
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.880 182096 DEBUG nova.virt.libvirt.vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.880 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.881 182096 DEBUG nova.network.os_vif_util [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.881 182096 DEBUG os_vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.881 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.881 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.881 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.882 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.883 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3f8d54c-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.883 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb3f8d54c-85, col_values=(('external_ids', {'iface-id': 'b3f8d54c-85ef-4fa7-8739-13144ea79fd4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:72:9b', 'vm-uuid': 'a7e87540-5713-4ac2-a9c8-942e11144ee8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:09 compute-0 NetworkManager[54920]: <info>  [1769161149.8845] manager: (tapb3f8d54c-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.884 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.889 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.889 182096 INFO os_vif [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85')
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.925 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.925 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.925 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:52:70:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.926 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:a8:72:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:39:09 compute-0 nova_compute[182092]: 2026-01-23 09:39:09.926 182096 INFO nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Using config drive
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.562 182096 INFO nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Creating config drive at /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.config
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.567 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpreax4jmw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.685 182096 DEBUG oslo_concurrency.processutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpreax4jmw" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:10 compute-0 kernel: tap2ffa52f2-84: entered promiscuous mode
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7257] manager: (tap2ffa52f2-84): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.728 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00610|binding|INFO|Claiming lport 2ffa52f2-8471-4c8c-ada7-a235d863feed for this chassis.
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00611|binding|INFO|2ffa52f2-8471-4c8c-ada7-a235d863feed: Claiming fa:16:3e:52:70:58 10.100.0.12
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7353] manager: (tapb3f8d54c-85): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Jan 23 09:39:10 compute-0 kernel: tapb3f8d54c-85: entered promiscuous mode
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.743 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:58 10.100.0.12'], port_security=['fa:16:3e:52:70:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e72efe-6307-41da-8074-25766fc6d9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6aa0531-db76-40ab-b0a7-eb7a32653be0, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2ffa52f2-8471-4c8c-ada7-a235d863feed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.744 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2ffa52f2-8471-4c8c-ada7-a235d863feed in datapath 19e72efe-6307-41da-8074-25766fc6d9f3 bound to our chassis
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00612|binding|INFO|Setting lport 2ffa52f2-8471-4c8c-ada7-a235d863feed ovn-installed in OVS
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00613|binding|INFO|Setting lport 2ffa52f2-8471-4c8c-ada7-a235d863feed up in Southbound
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.746 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.746 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19e72efe-6307-41da-8074-25766fc6d9f3
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.748 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00614|if_status|INFO|Not updating pb chassis for b3f8d54c-85ef-4fa7-8739-13144ea79fd4 now as sb is readonly
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00615|binding|INFO|Claiming lport b3f8d54c-85ef-4fa7-8739-13144ea79fd4 for this chassis.
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00616|binding|INFO|b3f8d54c-85ef-4fa7-8739-13144ea79fd4: Claiming fa:16:3e:a8:72:9b 2001:db8::f816:3eff:fea8:729b
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.755 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[98be5f45-167d-436a-a9e7-bed21926bd6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.755 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19e72efe-61 in ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:39:10 compute-0 systemd-udevd[229570]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:39:10 compute-0 systemd-udevd[229571]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.757 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19e72efe-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.757 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d69c3423-e1cf-4cd1-b5a8-2f1b29d0c946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.759 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec261b5-a5a2-46f7-811c-a89181cf9d10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.762 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:72:9b 2001:db8::f816:3eff:fea8:729b'], port_security=['fa:16:3e:a8:72:9b 2001:db8::f816:3eff:fea8:729b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea8:729b/64', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70104d04-d5ee-47d4-b059-a0882ae695ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7a180e3-fed5-40ae-a930-fe216e9b2af2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b3f8d54c-85ef-4fa7-8739-13144ea79fd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00617|binding|INFO|Setting lport b3f8d54c-85ef-4fa7-8739-13144ea79fd4 up in Southbound
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00618|binding|INFO|Setting lport b3f8d54c-85ef-4fa7-8739-13144ea79fd4 ovn-installed in OVS
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.769 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.769 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[86a4825a-2784-4f69-8f04-e6d959cafc8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7780] device (tapb3f8d54c-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7789] device (tapb3f8d54c-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7793] device (tap2ffa52f2-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.7797] device (tap2ffa52f2-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.779 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2c1cc2b4-a3a7-47e5-a48c-90a7bb1b584c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 systemd-machined[153562]: New machine qemu-78-instance-0000009b.
Jan 23 09:39:10 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-0000009b.
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.799 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[badad364-df5d-4516-b3a4-031c347e5d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.802 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f58c2aae-3d4f-4dd0-bbbd-e1832b89e40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.8035] manager: (tap19e72efe-60): new Veth device (/org/freedesktop/NetworkManager/Devices/309)
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.832 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c282fd65-f74d-4666-ba3e-f9a107b73257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.835 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b33b7c-eab6-4d6c-ae64-341314e1bc43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.8549] device (tap19e72efe-60): carrier: link connected
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.862 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[45dc3490-2018-4a69-a934-15912c50161f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.876 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ff42d0d0-c254-44bf-a3ab-7e77a3c4885d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19e72efe-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:0a:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465634, 'reachable_time': 16532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229598, 'error': None, 'target': 'ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.887 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[45ff7ba6-387e-496d-b1a6-d72b2ea87a7e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:a49'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465634, 'tstamp': 465634}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229599, 'error': None, 'target': 'ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.901 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[67c162ef-f407-4bf4-ab75-adee6512dd84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19e72efe-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:0a:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465634, 'reachable_time': 16532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229600, 'error': None, 'target': 'ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.925 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e41926b6-3c5a-4142-80c6-e4e0fd5c941f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.958 182096 DEBUG nova.compute.manager [req-ddfc6134-d0be-478a-bce2-d6ee43490871 req-597898cf-d428-4d3d-b1e1-dd4bf24a7adf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.958 182096 DEBUG oslo_concurrency.lockutils [req-ddfc6134-d0be-478a-bce2-d6ee43490871 req-597898cf-d428-4d3d-b1e1-dd4bf24a7adf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.959 182096 DEBUG oslo_concurrency.lockutils [req-ddfc6134-d0be-478a-bce2-d6ee43490871 req-597898cf-d428-4d3d-b1e1-dd4bf24a7adf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.959 182096 DEBUG oslo_concurrency.lockutils [req-ddfc6134-d0be-478a-bce2-d6ee43490871 req-597898cf-d428-4d3d-b1e1-dd4bf24a7adf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.959 182096 DEBUG nova.compute.manager [req-ddfc6134-d0be-478a-bce2-d6ee43490871 req-597898cf-d428-4d3d-b1e1-dd4bf24a7adf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Processing event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.970 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b69f757e-0fd0-4a18-a654-a263e0c7bec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.971 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19e72efe-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.971 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.972 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19e72efe-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.973 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 NetworkManager[54920]: <info>  [1769161150.9741] manager: (tap19e72efe-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 23 09:39:10 compute-0 kernel: tap19e72efe-60: entered promiscuous mode
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.977 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19e72efe-60, col_values=(('external_ids', {'iface-id': '2830e4e6-7d79-4a86-9435-6e54ee15cd1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.978 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.979 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.979 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19e72efe-6307-41da-8074-25766fc6d9f3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19e72efe-6307-41da-8074-25766fc6d9f3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:39:10 compute-0 ovn_controller[94697]: 2026-01-23T09:39:10Z|00619|binding|INFO|Releasing lport 2830e4e6-7d79-4a86-9435-6e54ee15cd1e from this chassis (sb_readonly=0)
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.991 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d670ba-1bf3-4451-9d6a-9280b1839f3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:10 compute-0 nova_compute[182092]: 2026-01-23 09:39:10.992 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.992 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-19e72efe-6307-41da-8074-25766fc6d9f3
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/19e72efe-6307-41da-8074-25766fc6d9f3.pid.haproxy
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 19e72efe-6307-41da-8074-25766fc6d9f3
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:39:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:10.993 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3', 'env', 'PROCESS_TAG=haproxy-19e72efe-6307-41da-8074-25766fc6d9f3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19e72efe-6307-41da-8074-25766fc6d9f3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.004 182096 DEBUG nova.compute.manager [req-fdc55061-bfe4-447a-a51e-6514b6a6f349 req-a8a6caa5-0f21-4384-9d28-b8a39b5b5023 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.004 182096 DEBUG oslo_concurrency.lockutils [req-fdc55061-bfe4-447a-a51e-6514b6a6f349 req-a8a6caa5-0f21-4384-9d28-b8a39b5b5023 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.004 182096 DEBUG oslo_concurrency.lockutils [req-fdc55061-bfe4-447a-a51e-6514b6a6f349 req-a8a6caa5-0f21-4384-9d28-b8a39b5b5023 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.004 182096 DEBUG oslo_concurrency.lockutils [req-fdc55061-bfe4-447a-a51e-6514b6a6f349 req-a8a6caa5-0f21-4384-9d28-b8a39b5b5023 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.004 182096 DEBUG nova.compute.manager [req-fdc55061-bfe4-447a-a51e-6514b6a6f349 req-a8a6caa5-0f21-4384-9d28-b8a39b5b5023 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Processing event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.074 182096 DEBUG nova.network.neutron [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updated VIF entry in instance network info cache for port b3f8d54c-85ef-4fa7-8739-13144ea79fd4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.075 182096 DEBUG nova.network.neutron [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.089 182096 DEBUG oslo_concurrency.lockutils [req-73d2c562-e2b2-4ad7-aecc-ff3ec6ac0253 req-ef84591b-c307-4639-b127-8aa5619975c1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:11 compute-0 podman[229628]: 2026-01-23 09:39:11.283310736 +0000 UTC m=+0.033336154 container create c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 09:39:11 compute-0 systemd[1]: Started libpod-conmon-c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa.scope.
Jan 23 09:39:11 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:39:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3bf369a99a0284bb45fc502ff5b20065aa3db706cd4eecfa2afb7802d5ad6f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:39:11 compute-0 podman[229628]: 2026-01-23 09:39:11.331264724 +0000 UTC m=+0.081290162 container init c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:39:11 compute-0 podman[229628]: 2026-01-23 09:39:11.337303231 +0000 UTC m=+0.087328649 container start c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:39:11 compute-0 podman[229628]: 2026-01-23 09:39:11.268853523 +0000 UTC m=+0.018878941 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:39:11 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [NOTICE]   (229644) : New worker (229646) forked
Jan 23 09:39:11 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [NOTICE]   (229644) : Loading success.
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.377 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b3f8d54c-85ef-4fa7-8739-13144ea79fd4 in datapath 70104d04-d5ee-47d4-b059-a0882ae695ee unbound from our chassis
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.378 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 70104d04-d5ee-47d4-b059-a0882ae695ee
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.387 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[87fc4ef0-84fb-44a1-9474-20f63c63a300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.388 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap70104d04-d1 in ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.389 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap70104d04-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.389 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[674c0c66-2147-4bde-8e34-f543b35ee2e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.390 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[73d075b6-0449-481b-9497-c7dc0e530684]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.398 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b545ee35-53ac-451b-8f91-4c51bb1a5825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.407 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c1409c5c-4b31-4bdf-a16d-66fcc3df58ab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.433 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1794129a-923a-4c07-a008-e44079d6eae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 systemd-udevd[229590]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.438 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa64118b-2fd9-406d-a5a0-abf80dbebe6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 NetworkManager[54920]: <info>  [1769161151.4395] manager: (tap70104d04-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.449 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161151.449037, a7e87540-5713-4ac2-a9c8-942e11144ee8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.449 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] VM Started (Lifecycle Event)
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.452 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.459 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.461 182096 INFO nova.virt.libvirt.driver [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance spawned successfully.
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.462 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.468 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[46339bbc-3abf-459c-9ba3-37bb63589980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.470 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e1d25b-f311-4279-abfc-20cb76a0262c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 NetworkManager[54920]: <info>  [1769161151.4895] device (tap70104d04-d0): carrier: link connected
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.493 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1b88701f-2060-453e-bf15-de3f87b64f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.507 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ab598e1f-3cea-480e-9eb0-8e5bb7825308]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70104d04-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:d3:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465697, 'reachable_time': 34663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229669, 'error': None, 'target': 'ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.519 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[956d463f-6a00-4a25-b667-1e362258befd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4f:d3a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465697, 'tstamp': 465697}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229670, 'error': None, 'target': 'ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.530 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0c415778-3d81-41c8-96e0-ffdec5381495]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70104d04-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:d3:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465697, 'reachable_time': 34663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229671, 'error': None, 'target': 'ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.543 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.546 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.547 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.547 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.547 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.548 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.548 182096 DEBUG nova.virt.libvirt.driver [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.551 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.555 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb3842f-5d05-4853-a2d6-24b5152abe7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.574 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.574 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161151.4491146, a7e87540-5713-4ac2-a9c8-942e11144ee8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.574 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] VM Paused (Lifecycle Event)
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.581 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1317ae27-f610-4a77-8813-4f7bd9c16b23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.582 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70104d04-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.582 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.583 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70104d04-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:11 compute-0 kernel: tap70104d04-d0: entered promiscuous mode
Jan 23 09:39:11 compute-0 NetworkManager[54920]: <info>  [1769161151.5846] manager: (tap70104d04-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.588 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap70104d04-d0, col_values=(('external_ids', {'iface-id': 'a419c1b5-3dc3-4292-bea6-667dcc1d421c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:11 compute-0 ovn_controller[94697]: 2026-01-23T09:39:11Z|00620|binding|INFO|Releasing lport a419c1b5-3dc3-4292-bea6-667dcc1d421c from this chassis (sb_readonly=0)
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.589 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/70104d04-d5ee-47d4-b059-a0882ae695ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/70104d04-d5ee-47d4-b059-a0882ae695ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.584 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.603 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.606 182096 INFO nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Took 7.08 seconds to spawn the instance on the hypervisor.
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.606 182096 DEBUG nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.603 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[20400573-1212-401e-be8c-a137c259ee9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.607 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-70104d04-d5ee-47d4-b059-a0882ae695ee
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/70104d04-d5ee-47d4-b059-a0882ae695ee.pid.haproxy
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 70104d04-d5ee-47d4-b059-a0882ae695ee
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.608 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161151.4579098, a7e87540-5713-4ac2-a9c8-942e11144ee8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:11 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:11.608 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee', 'env', 'PROCESS_TAG=haproxy-70104d04-d5ee-47d4-b059-a0882ae695ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/70104d04-d5ee-47d4-b059-a0882ae695ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.608 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] VM Resumed (Lifecycle Event)
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.628 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.630 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.650 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.672 182096 INFO nova.compute.manager [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Took 7.48 seconds to build instance.
Jan 23 09:39:11 compute-0 nova_compute[182092]: 2026-01-23 09:39:11.682 182096 DEBUG oslo_concurrency.lockutils [None req-1b34f2c8-7c71-4e07-92e8-e75d6f0861a1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:11 compute-0 podman[229697]: 2026-01-23 09:39:11.886197567 +0000 UTC m=+0.030914557 container create 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:39:11 compute-0 systemd[1]: Started libpod-conmon-21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89.scope.
Jan 23 09:39:11 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:39:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674cb646332f72b69a2d7ac8ceb3d69fa85ca6519f94007735318ce4fdfc44a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:39:11 compute-0 podman[229697]: 2026-01-23 09:39:11.950265921 +0000 UTC m=+0.094982932 container init 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:39:11 compute-0 podman[229697]: 2026-01-23 09:39:11.955227418 +0000 UTC m=+0.099944408 container start 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:39:11 compute-0 podman[229697]: 2026-01-23 09:39:11.8720456 +0000 UTC m=+0.016762611 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:39:11 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [NOTICE]   (229713) : New worker (229715) forked
Jan 23 09:39:11 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [NOTICE]   (229713) : Loading success.
Jan 23 09:39:12 compute-0 nova_compute[182092]: 2026-01-23 09:39:12.499 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:12 compute-0 nova_compute[182092]: 2026-01-23 09:39:12.668 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:12 compute-0 nova_compute[182092]: 2026-01-23 09:39:12.669 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.070 182096 DEBUG nova.compute.manager [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.071 182096 DEBUG oslo_concurrency.lockutils [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.071 182096 DEBUG oslo_concurrency.lockutils [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.071 182096 DEBUG oslo_concurrency.lockutils [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.071 182096 DEBUG nova.compute.manager [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.072 182096 WARNING nova.compute.manager [req-54cdb8d6-0559-4b08-b69b-c05c16e73a80 req-70391aa9-804a-43a5-9df0-b6f123acbb9c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received unexpected event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 for instance with vm_state active and task_state None.
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.278 182096 DEBUG nova.compute.manager [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.279 182096 DEBUG oslo_concurrency.lockutils [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.279 182096 DEBUG oslo_concurrency.lockutils [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.279 182096 DEBUG oslo_concurrency.lockutils [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.279 182096 DEBUG nova.compute.manager [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:39:13 compute-0 nova_compute[182092]: 2026-01-23 09:39:13.279 182096 WARNING nova.compute.manager [req-5bad9ad1-9d2e-43ef-bda4-f2e78e298db3 req-3731b995-f525-445c-b470-f75f56a49607 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received unexpected event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed for instance with vm_state active and task_state None.
Jan 23 09:39:14 compute-0 nova_compute[182092]: 2026-01-23 09:39:14.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:14 compute-0 nova_compute[182092]: 2026-01-23 09:39:14.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:14 compute-0 nova_compute[182092]: 2026-01-23 09:39:14.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:39:14 compute-0 nova_compute[182092]: 2026-01-23 09:39:14.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:39:14 compute-0 nova_compute[182092]: 2026-01-23 09:39:14.886 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.573 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.573 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.573 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.573 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.683 182096 DEBUG nova.compute.manager [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.684 182096 DEBUG nova.compute.manager [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing instance network info cache due to event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.685 182096 DEBUG oslo_concurrency.lockutils [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.686 182096 DEBUG oslo_concurrency.lockutils [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:15 compute-0 nova_compute[182092]: 2026-01-23 09:39:15.686 182096 DEBUG nova.network.neutron [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing network info cache for port 2ffa52f2-8471-4c8c-ada7-a235d863feed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:16 compute-0 podman[229720]: 2026-01-23 09:39:16.221006437 +0000 UTC m=+0.058544089 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 23 09:39:17 compute-0 nova_compute[182092]: 2026-01-23 09:39:17.500 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.632 182096 DEBUG nova.network.neutron [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updated VIF entry in instance network info cache for port 2ffa52f2-8471-4c8c-ada7-a235d863feed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.633 182096 DEBUG nova.network.neutron [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.634 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.657 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.657 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.659 182096 DEBUG oslo_concurrency.lockutils [req-1bf8a728-e357-407f-aaca-f0a317e1e425 req-56ea95d2-28f0-411e-af21-356fa6650ed3 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.659 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.674 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.675 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.719 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.776 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.777 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.833 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.837 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.890 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.895 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.895 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:19 compute-0 nova_compute[182092]: 2026-01-23 09:39:19.956 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.197 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.198 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5441MB free_disk=73.18312454223633GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.198 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.199 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.260 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance bd78986f-3b06-4345-8540-cb6ab9009fb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.260 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance a7e87540-5713-4ac2-a9c8-942e11144ee8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.260 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.260 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.304 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.315 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.328 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:39:20 compute-0 nova_compute[182092]: 2026-01-23 09:39:20.328 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:21 compute-0 nova_compute[182092]: 2026-01-23 09:39:21.319 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:39:21 compute-0 nova_compute[182092]: 2026-01-23 09:39:21.319 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:39:21 compute-0 nova_compute[182092]: 2026-01-23 09:39:21.479 182096 INFO nova.compute.manager [None req-804bf02e-34a8-4fa5-9486-9644a748336e 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Get console output
Jan 23 09:39:21 compute-0 nova_compute[182092]: 2026-01-23 09:39:21.484 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.205 182096 DEBUG nova.compute.manager [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-changed-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.205 182096 DEBUG nova.compute.manager [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing instance network info cache due to event network-changed-647c3660-1915-4e07-8634-43e7c7e49321. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.205 182096 DEBUG oslo_concurrency.lockutils [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.205 182096 DEBUG oslo_concurrency.lockutils [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.205 182096 DEBUG nova.network.neutron [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Refreshing network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.280 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.280 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.280 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.280 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.281 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.289 182096 INFO nova.compute.manager [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Terminating instance
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.295 182096 DEBUG nova.compute.manager [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:39:22 compute-0 kernel: tap647c3660-19 (unregistering): left promiscuous mode
Jan 23 09:39:22 compute-0 NetworkManager[54920]: <info>  [1769161162.3159] device (tap647c3660-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.330 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 ovn_controller[94697]: 2026-01-23T09:39:22Z|00621|binding|INFO|Releasing lport 647c3660-1915-4e07-8634-43e7c7e49321 from this chassis (sb_readonly=0)
Jan 23 09:39:22 compute-0 ovn_controller[94697]: 2026-01-23T09:39:22Z|00622|binding|INFO|Setting lport 647c3660-1915-4e07-8634-43e7c7e49321 down in Southbound
Jan 23 09:39:22 compute-0 ovn_controller[94697]: 2026-01-23T09:39:22Z|00623|binding|INFO|Removing iface tap647c3660-19 ovn-installed in OVS
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.332 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.336 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:49:8c 10.100.0.10'], port_security=['fa:16:3e:b4:49:8c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bd78986f-3b06-4345-8540-cb6ab9009fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e313631-de8d-446e-b2e7-770dbeeb9852', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16f4f7bd-41ae-4b52-8026-76105c06c3aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e71e79f1-2acd-43dc-97c0-c92006b6370d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=647c3660-1915-4e07-8634-43e7c7e49321) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.337 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 647c3660-1915-4e07-8634-43e7c7e49321 in datapath 2e313631-de8d-446e-b2e7-770dbeeb9852 unbound from our chassis
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.339 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e313631-de8d-446e-b2e7-770dbeeb9852, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.346 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[45f644b6-5201-444e-9494-a35b9ee9a498]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.346 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852 namespace which is not needed anymore
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.348 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 ovn_controller[94697]: 2026-01-23T09:39:22Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:70:58 10.100.0.12
Jan 23 09:39:22 compute-0 ovn_controller[94697]: 2026-01-23T09:39:22Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:70:58 10.100.0.12
Jan 23 09:39:22 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000093.scope: Deactivated successfully.
Jan 23 09:39:22 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000093.scope: Consumed 14.080s CPU time.
Jan 23 09:39:22 compute-0 systemd-machined[153562]: Machine qemu-76-instance-00000093 terminated.
Jan 23 09:39:22 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [NOTICE]   (228797) : haproxy version is 2.8.14-c23fe91
Jan 23 09:39:22 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [NOTICE]   (228797) : path to executable is /usr/sbin/haproxy
Jan 23 09:39:22 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [WARNING]  (228797) : Exiting Master process...
Jan 23 09:39:22 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [ALERT]    (228797) : Current worker (228799) exited with code 143 (Terminated)
Jan 23 09:39:22 compute-0 neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852[228793]: [WARNING]  (228797) : All workers exited. Exiting... (0)
Jan 23 09:39:22 compute-0 systemd[1]: libpod-e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59.scope: Deactivated successfully.
Jan 23 09:39:22 compute-0 podman[229799]: 2026-01-23 09:39:22.44891674 +0000 UTC m=+0.037867299 container died e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:39:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59-userdata-shm.mount: Deactivated successfully.
Jan 23 09:39:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a85481472e38637ca6b038b4a7d084995f5b9b51266f95475844976049913bd-merged.mount: Deactivated successfully.
Jan 23 09:39:22 compute-0 podman[229799]: 2026-01-23 09:39:22.476297752 +0000 UTC m=+0.065248302 container cleanup e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:39:22 compute-0 systemd[1]: libpod-conmon-e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59.scope: Deactivated successfully.
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.503 182096 DEBUG nova.compute.manager [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-unplugged-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.504 182096 DEBUG oslo_concurrency.lockutils [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.505 182096 DEBUG oslo_concurrency.lockutils [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.505 182096 DEBUG oslo_concurrency.lockutils [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.505 182096 DEBUG nova.compute.manager [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] No waiting events found dispatching network-vif-unplugged-647c3660-1915-4e07-8634-43e7c7e49321 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.505 182096 DEBUG nova.compute.manager [req-8e746837-bad8-4f55-8378-9352e3cf858c req-560727ce-7234-485f-a0f7-e69f9e7f0461 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-unplugged-647c3660-1915-4e07-8634-43e7c7e49321 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.505 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 podman[229821]: 2026-01-23 09:39:22.533285226 +0000 UTC m=+0.040460979 container remove e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.540 182096 INFO nova.virt.libvirt.driver [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance destroyed successfully.
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.539 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[faeb5d4e-31ec-45b1-af33-a8c400ae2b53]: (4, ('Fri Jan 23 09:39:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852 (e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59)\ne8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59\nFri Jan 23 09:39:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852 (e8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59)\ne8f5c7cf9ccb295d22e139c5d8180400d6bdb7f1a8227b5d84bbf9bed4893b59\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.540 182096 DEBUG nova.objects.instance [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid bd78986f-3b06-4345-8540-cb6ab9009fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.542 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2c42d654-29f3-4f52-920d-099736e09a8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.542 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e313631-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.556 182096 DEBUG nova.virt.libvirt.vif [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:36:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-449213939',display_name='tempest-TestNetworkBasicOps-server-449213939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-449213939',id=147,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDLlh/W8fpdYSmbxeIab5T+zjxt4urFKyLKqSjPhnxgn6AKuUkpHIVSILw1O0d54oiROsTsbOfe+NEugfjZDIrHCSlRU3BF04WQfou4hQRgLlD/lKNDa5dCseNGA1Hqsqw==',key_name='tempest-TestNetworkBasicOps-974513515',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-xkgdgeo1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:01Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=bd78986f-3b06-4345-8540-cb6ab9009fb2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.556 182096 DEBUG nova.network.os_vif_util [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:22 compute-0 kernel: tap2e313631-d0: left promiscuous mode
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.558 182096 DEBUG nova.network.os_vif_util [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.558 182096 DEBUG os_vif [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.560 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.560 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap647c3660-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.562 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.563 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.563 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b728f25b-5e67-4e47-9eb5-bd16070afd8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.565 182096 INFO os_vif [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:49:8c,bridge_name='br-int',has_traffic_filtering=True,id=647c3660-1915-4e07-8634-43e7c7e49321,network=Network(2e313631-de8d-446e-b2e7-770dbeeb9852),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap647c3660-19')
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.566 182096 INFO nova.virt.libvirt.driver [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Deleting instance files /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2_del
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.566 182096 INFO nova.virt.libvirt.driver [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Deletion of /var/lib/nova/instances/bd78986f-3b06-4345-8540-cb6ab9009fb2_del complete
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.569 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[80d96d0a-8d26-4d83-a3c3-b7cbdcf61112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.570 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b37dd5-959c-41f8-b645-03cc5ce71f76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.582 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef2fa87-3e0f-40d1-b55f-a1b388238c62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452695, 'reachable_time': 43086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229851, 'error': None, 'target': 'ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 systemd[1]: run-netns-ovnmeta\x2d2e313631\x2dde8d\x2d446e\x2db2e7\x2d770dbeeb9852.mount: Deactivated successfully.
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.584 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2e313631-de8d-446e-b2e7-770dbeeb9852 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:39:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:22.585 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[17ab79f4-1e88-4e06-882a-4015749daeed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.626 182096 INFO nova.compute.manager [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.626 182096 DEBUG oslo.service.loopingcall [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.626 182096 DEBUG nova.compute.manager [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:39:22 compute-0 nova_compute[182092]: 2026-01-23 09:39:22.626 182096 DEBUG nova.network.neutron [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.146 182096 DEBUG nova.network.neutron [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.153 182096 DEBUG nova.network.neutron [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updated VIF entry in instance network info cache for port 647c3660-1915-4e07-8634-43e7c7e49321. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.154 182096 DEBUG nova.network.neutron [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Updating instance_info_cache with network_info: [{"id": "647c3660-1915-4e07-8634-43e7c7e49321", "address": "fa:16:3e:b4:49:8c", "network": {"id": "2e313631-de8d-446e-b2e7-770dbeeb9852", "bridge": "br-int", "label": "tempest-network-smoke--362121355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap647c3660-19", "ovs_interfaceid": "647c3660-1915-4e07-8634-43e7c7e49321", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.167 182096 INFO nova.compute.manager [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Took 0.54 seconds to deallocate network for instance.
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.172 182096 DEBUG oslo_concurrency.lockutils [req-8d7dbeea-099c-4505-a3ae-f97d248723b0 req-7ed115fd-35a0-4f49-a4b9-124370980c05 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-bd78986f-3b06-4345-8540-cb6ab9009fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.218 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.218 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.270 182096 DEBUG nova.compute.provider_tree [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.282 182096 DEBUG nova.scheduler.client.report [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.295 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.330 182096 INFO nova.scheduler.client.report [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance bd78986f-3b06-4345-8540-cb6ab9009fb2
Jan 23 09:39:23 compute-0 nova_compute[182092]: 2026-01-23 09:39:23.430 182096 DEBUG oslo_concurrency.lockutils [None req-91bc6a0b-40f6-4ecd-9481-f57c43dbd6e1 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:24 compute-0 podman[229852]: 2026-01-23 09:39:24.206734856 +0000 UTC m=+0.038856545 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 09:39:24 compute-0 podman[229853]: 2026-01-23 09:39:24.213357755 +0000 UTC m=+0.044734508 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.580 182096 DEBUG nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.580 182096 DEBUG oslo_concurrency.lockutils [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.580 182096 DEBUG oslo_concurrency.lockutils [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.581 182096 DEBUG oslo_concurrency.lockutils [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "bd78986f-3b06-4345-8540-cb6ab9009fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.581 182096 DEBUG nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] No waiting events found dispatching network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.581 182096 WARNING nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received unexpected event network-vif-plugged-647c3660-1915-4e07-8634-43e7c7e49321 for instance with vm_state deleted and task_state None.
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.581 182096 DEBUG nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Received event network-vif-deleted-647c3660-1915-4e07-8634-43e7c7e49321 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.581 182096 INFO nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Neutron deleted interface 647c3660-1915-4e07-8634-43e7c7e49321; detaching it from the instance and deleting it from the info cache
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.582 182096 DEBUG nova.network.neutron [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 23 09:39:24 compute-0 nova_compute[182092]: 2026-01-23 09:39:24.583 182096 DEBUG nova.compute.manager [req-bb581d14-03e0-4922-9508-c11289edeb03 req-28f32dc1-69bb-42e0-9846-e58eec2e4af7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Detach interface failed, port_id=647c3660-1915-4e07-8634-43e7c7e49321, reason: Instance bd78986f-3b06-4345-8540-cb6ab9009fb2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:39:25 compute-0 ovn_controller[94697]: 2026-01-23T09:39:25Z|00624|binding|INFO|Releasing lport a419c1b5-3dc3-4292-bea6-667dcc1d421c from this chassis (sb_readonly=0)
Jan 23 09:39:25 compute-0 ovn_controller[94697]: 2026-01-23T09:39:25Z|00625|binding|INFO|Releasing lport 2830e4e6-7d79-4a86-9435-6e54ee15cd1e from this chassis (sb_readonly=0)
Jan 23 09:39:25 compute-0 nova_compute[182092]: 2026-01-23 09:39:25.948 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:27 compute-0 nova_compute[182092]: 2026-01-23 09:39:27.505 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:27 compute-0 nova_compute[182092]: 2026-01-23 09:39:27.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:29 compute-0 nova_compute[182092]: 2026-01-23 09:39:29.295 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:29 compute-0 ovn_controller[94697]: 2026-01-23T09:39:29Z|00626|binding|INFO|Releasing lport a419c1b5-3dc3-4292-bea6-667dcc1d421c from this chassis (sb_readonly=0)
Jan 23 09:39:29 compute-0 ovn_controller[94697]: 2026-01-23T09:39:29Z|00627|binding|INFO|Releasing lport 2830e4e6-7d79-4a86-9435-6e54ee15cd1e from this chassis (sb_readonly=0)
Jan 23 09:39:29 compute-0 nova_compute[182092]: 2026-01-23 09:39:29.700 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:32 compute-0 nova_compute[182092]: 2026-01-23 09:39:32.508 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:32 compute-0 nova_compute[182092]: 2026-01-23 09:39:32.562 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:33 compute-0 nova_compute[182092]: 2026-01-23 09:39:33.313 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:33 compute-0 nova_compute[182092]: 2026-01-23 09:39:33.969 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:35 compute-0 podman[229891]: 2026-01-23 09:39:35.206195016 +0000 UTC m=+0.039700916 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:39:35 compute-0 podman[229892]: 2026-01-23 09:39:35.206244458 +0000 UTC m=+0.037444941 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:39:37 compute-0 nova_compute[182092]: 2026-01-23 09:39:37.510 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:37 compute-0 nova_compute[182092]: 2026-01-23 09:39:37.539 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161162.537919, bd78986f-3b06-4345-8540-cb6ab9009fb2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:37 compute-0 nova_compute[182092]: 2026-01-23 09:39:37.539 182096 INFO nova.compute.manager [-] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] VM Stopped (Lifecycle Event)
Jan 23 09:39:37 compute-0 nova_compute[182092]: 2026-01-23 09:39:37.563 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:37 compute-0 nova_compute[182092]: 2026-01-23 09:39:37.643 182096 DEBUG nova.compute.manager [None req-67d61fd1-e61b-4097-850e-d25f33a1a693 - - - - - -] [instance: bd78986f-3b06-4345-8540-cb6ab9009fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:38 compute-0 podman[229928]: 2026-01-23 09:39:38.208381882 +0000 UTC m=+0.041552287 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Jan 23 09:39:38 compute-0 nova_compute[182092]: 2026-01-23 09:39:38.370 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:39.870 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:41.630 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:39:41 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:41.631 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.631 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.754 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.754 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.764 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.830 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.830 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.834 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.834 182096 INFO nova.compute.claims [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.914 182096 DEBUG nova.compute.provider_tree [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.922 182096 DEBUG nova.scheduler.client.report [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.935 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.935 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.975 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.975 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.987 182096 INFO nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:39:41 compute-0 nova_compute[182092]: 2026-01-23 09:39:41.999 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.114 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.115 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.115 182096 INFO nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Creating image(s)
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.116 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.116 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.116 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.126 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.149 182096 DEBUG nova.policy [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.173 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.173 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.174 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.184 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.229 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.229 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.250 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.250 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.251 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.294 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.294 182096 DEBUG nova.virt.disk.api [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Checking if we can resize image /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.295 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.339 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.340 182096 DEBUG nova.virt.disk.api [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Cannot resize image /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.340 182096 DEBUG nova.objects.instance [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lazy-loading 'migration_context' on Instance uuid c090cdd2-7b44-4f97-9ccb-253fe16eef62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.352 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.353 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Ensure instance console log exists: /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.353 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.353 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.353 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.511 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:42 compute-0 nova_compute[182092]: 2026-01-23 09:39:42.563 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.024 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Successfully created port: dd650e27-b8d1-4200-a549-2495e4e181fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.592 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Successfully updated port: dd650e27-b8d1-4200-a549-2495e4e181fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.603 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.603 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquired lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.603 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:39:43 compute-0 nova_compute[182092]: 2026-01-23 09:39:43.762 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.620 182096 DEBUG nova.network.neutron [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updating instance_info_cache with network_info: [{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.641 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Releasing lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.641 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Instance network_info: |[{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.643 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Start _get_guest_xml network_info=[{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.645 182096 WARNING nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.650 182096 DEBUG nova.virt.libvirt.host [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.650 182096 DEBUG nova.virt.libvirt.host [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.660 182096 DEBUG nova.virt.libvirt.host [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.661 182096 DEBUG nova.virt.libvirt.host [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.661 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.662 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.662 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.662 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.662 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.663 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.663 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.663 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.663 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.663 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.664 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.664 182096 DEBUG nova.virt.hardware [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.666 182096 DEBUG nova.virt.libvirt.vif [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-315406914-acc',id=158,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDhSC5lTbToOC4esGJ7L03gtjuSkFHF+O4VOv/0ddgi0bh9OfG3oIyqm1/8NB7Q4aoFUFogxzG/XzWucYO6KnLvyJ2b8377h31gIuMHmN7lo/b662FlKgXZo/fiSt4ZgZQ==',key_name='tempest-TestSecurityGroupsBasicOps-26663372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5be4fcb5d9df463eb390de7dc8be6fec',ramdisk_id='',reservation_id='r-fmd9kbtt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-315406914',owner_user_name='tempest-TestSecurityGroupsBasicOps-315406914-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:42Z,user_data=None,user_id='ca9dd50592344370a9c82fc29637452b',uuid=c090cdd2-7b44-4f97-9ccb-253fe16eef62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.667 182096 DEBUG nova.network.os_vif_util [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converting VIF {"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.667 182096 DEBUG nova.network.os_vif_util [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.668 182096 DEBUG nova.objects.instance [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lazy-loading 'pci_devices' on Instance uuid c090cdd2-7b44-4f97-9ccb-253fe16eef62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.682 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <uuid>c090cdd2-7b44-4f97-9ccb-253fe16eef62</uuid>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <name>instance-0000009e</name>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385</nova:name>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:39:44</nova:creationTime>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:user uuid="ca9dd50592344370a9c82fc29637452b">tempest-TestSecurityGroupsBasicOps-315406914-project-member</nova:user>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:project uuid="5be4fcb5d9df463eb390de7dc8be6fec">tempest-TestSecurityGroupsBasicOps-315406914</nova:project>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         <nova:port uuid="dd650e27-b8d1-4200-a549-2495e4e181fa">
Jan 23 09:39:44 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="serial">c090cdd2-7b44-4f97-9ccb-253fe16eef62</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="uuid">c090cdd2-7b44-4f97-9ccb-253fe16eef62</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.config"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:b3:8d:19"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <target dev="tapdd650e27-b8"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/console.log" append="off"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:39:44 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:39:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:39:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:39:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:39:44 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.683 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Preparing to wait for external event network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.683 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.683 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.683 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.684 182096 DEBUG nova.virt.libvirt.vif [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-315406914-acc',id=158,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDhSC5lTbToOC4esGJ7L03gtjuSkFHF+O4VOv/0ddgi0bh9OfG3oIyqm1/8NB7Q4aoFUFogxzG/XzWucYO6KnLvyJ2b8377h31gIuMHmN7lo/b662FlKgXZo/fiSt4ZgZQ==',key_name='tempest-TestSecurityGroupsBasicOps-26663372',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5be4fcb5d9df463eb390de7dc8be6fec',ramdisk_id='',reservation_id='r-fmd9kbtt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-315406914',owner_user_name='tempest-TestSecurityGroupsBasicOps-315406914-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:42Z,user_data=None,user_id='ca9dd50592344370a9c82fc29637452b',uuid=c090cdd2-7b44-4f97-9ccb-253fe16eef62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.684 182096 DEBUG nova.network.os_vif_util [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converting VIF {"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.685 182096 DEBUG nova.network.os_vif_util [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.685 182096 DEBUG os_vif [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.685 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.686 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.686 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.688 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.688 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd650e27-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.688 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd650e27-b8, col_values=(('external_ids', {'iface-id': 'dd650e27-b8d1-4200-a549-2495e4e181fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:8d:19', 'vm-uuid': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.689 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:44 compute-0 NetworkManager[54920]: <info>  [1769161184.6904] manager: (tapdd650e27-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.693 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.693 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.694 182096 INFO os_vif [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8')
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.744 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.744 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.745 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] No VIF found with MAC fa:16:3e:b3:8d:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:39:44 compute-0 nova_compute[182092]: 2026-01-23 09:39:44.745 182096 INFO nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Using config drive
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.012 182096 INFO nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Creating config drive at /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.config
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.017 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7di1g6za execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.135 182096 DEBUG oslo_concurrency.processutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7di1g6za" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:39:45 compute-0 kernel: tapdd650e27-b8: entered promiscuous mode
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.1687] manager: (tapdd650e27-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 23 09:39:45 compute-0 ovn_controller[94697]: 2026-01-23T09:39:45Z|00628|binding|INFO|Claiming lport dd650e27-b8d1-4200-a549-2495e4e181fa for this chassis.
Jan 23 09:39:45 compute-0 ovn_controller[94697]: 2026-01-23T09:39:45Z|00629|binding|INFO|dd650e27-b8d1-4200-a549-2495e4e181fa: Claiming fa:16:3e:b3:8d:19 10.100.0.13
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.171 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.184 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:8d:19 10.100.0.13'], port_security=['fa:16:3e:b3:8d:19 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f1cf616-2c32-488c-baab-73aa253de79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2c46b324-68c0-4a50-a85f-4e212875add8 56176f0a-2fb7-4e45-921d-b2b7e6c271bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae90b9ba-24e1-405b-9cf2-8420d2ca0301, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dd650e27-b8d1-4200-a549-2495e4e181fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.185 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dd650e27-b8d1-4200-a549-2495e4e181fa in datapath 9f1cf616-2c32-488c-baab-73aa253de79f bound to our chassis
Jan 23 09:39:45 compute-0 ovn_controller[94697]: 2026-01-23T09:39:45Z|00630|binding|INFO|Setting lport dd650e27-b8d1-4200-a549-2495e4e181fa up in Southbound
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.186 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f1cf616-2c32-488c-baab-73aa253de79f
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.187 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_controller[94697]: 2026-01-23T09:39:45Z|00631|binding|INFO|Setting lport dd650e27-b8d1-4200-a549-2495e4e181fa ovn-installed in OVS
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.194 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.194 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7df67e2b-ddd5-46e3-a1d1-4c11f6fc2776]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.195 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f1cf616-21 in ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.196 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f1cf616-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.196 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e850491b-bd3a-40b0-bf94-9b9f10696696]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.197 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[85b6e22e-58e7-4b67-8600-2673ac1e22f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 systemd-udevd[229981]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.208 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[f08fcf1e-ded7-4cf0-aca5-e80522b15343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.2097] device (tapdd650e27-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.2104] device (tapdd650e27-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:39:45 compute-0 systemd-machined[153562]: New machine qemu-79-instance-0000009e.
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.217 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7debca25-f26e-4097-8c81-c526c00ce9a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-0000009e.
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.239 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b15f9daa-9363-43f3-9cb4-2a8b62c08ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.2443] manager: (tap9f1cf616-20): new Veth device (/org/freedesktop/NetworkManager/Devices/315)
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.243 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7919845d-de7a-4861-83f2-c01ace6108f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 systemd-udevd[229985]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.271 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[f738b445-33d5-4466-aba1-ff94c45107d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.277 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c627d973-87ce-43b9-bb44-cb4cebc139aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.2917] device (tap9f1cf616-20): carrier: link connected
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.295 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce2ac50-356b-4857-8ec0-5755fe018ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.306 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[82eb7300-103b-48ac-85e7-1caa5ef78ed3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f1cf616-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:5a:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469077, 'reachable_time': 42414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230006, 'error': None, 'target': 'ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.316 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[36021502-5c5c-4c87-981a-09f02c473ba0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:5ad6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469077, 'tstamp': 469077}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230007, 'error': None, 'target': 'ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.326 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6a883c27-9e69-457f-98c3-2e3274c5695c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f1cf616-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:5a:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469077, 'reachable_time': 42414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230008, 'error': None, 'target': 'ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.344 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[01f9504f-621d-4732-8ba0-fe55f27b2795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.383 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fc863601-2671-47e5-bfc2-1dfa5284c171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.384 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f1cf616-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.385 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.386 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f1cf616-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.388 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 kernel: tap9f1cf616-20: entered promiscuous mode
Jan 23 09:39:45 compute-0 NetworkManager[54920]: <info>  [1769161185.3891] manager: (tap9f1cf616-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.393 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f1cf616-20, col_values=(('external_ids', {'iface-id': 'ddbf3865-7fb3-4d33-bb90-eb46c309610d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.393 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_controller[94697]: 2026-01-23T09:39:45Z|00632|binding|INFO|Releasing lport ddbf3865-7fb3-4d33-bb90-eb46c309610d from this chassis (sb_readonly=0)
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.394 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.396 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f1cf616-2c32-488c-baab-73aa253de79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f1cf616-2c32-488c-baab-73aa253de79f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.406 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.406 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed6ac28-3659-4699-b246-93745c5ee77d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.407 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-9f1cf616-2c32-488c-baab-73aa253de79f
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/9f1cf616-2c32-488c-baab-73aa253de79f.pid.haproxy
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 9f1cf616-2c32-488c-baab-73aa253de79f
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:39:45 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:45.408 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f', 'env', 'PROCESS_TAG=haproxy-9f1cf616-2c32-488c-baab-73aa253de79f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f1cf616-2c32-488c-baab-73aa253de79f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.597 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161185.597383, c090cdd2-7b44-4f97-9ccb-253fe16eef62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.598 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] VM Started (Lifecycle Event)
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.612 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.615 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161185.5974824, c090cdd2-7b44-4f97-9ccb-253fe16eef62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.615 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] VM Paused (Lifecycle Event)
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.629 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.631 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:39:45 compute-0 nova_compute[182092]: 2026-01-23 09:39:45.647 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:39:45 compute-0 podman[230043]: 2026-01-23 09:39:45.695953014 +0000 UTC m=+0.033128492 container create 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:39:45 compute-0 systemd[1]: Started libpod-conmon-4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744.scope.
Jan 23 09:39:45 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:39:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92a96dacea434b4d2030329f295a568a2ea4c5daa874076935a1f54718b0b801/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:39:45 compute-0 podman[230043]: 2026-01-23 09:39:45.752455732 +0000 UTC m=+0.089631210 container init 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:39:45 compute-0 podman[230043]: 2026-01-23 09:39:45.756722519 +0000 UTC m=+0.093897986 container start 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 09:39:45 compute-0 podman[230043]: 2026-01-23 09:39:45.680803767 +0000 UTC m=+0.017979264 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:39:45 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [NOTICE]   (230059) : New worker (230061) forked
Jan 23 09:39:45 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [NOTICE]   (230059) : Loading success.
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.027 182096 DEBUG nova.compute.manager [req-8df74aa5-d989-41d8-ac5f-d754c18118cc req-753a7a11-00a0-47d8-91d4-d5814ab68d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.027 182096 DEBUG oslo_concurrency.lockutils [req-8df74aa5-d989-41d8-ac5f-d754c18118cc req-753a7a11-00a0-47d8-91d4-d5814ab68d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.028 182096 DEBUG oslo_concurrency.lockutils [req-8df74aa5-d989-41d8-ac5f-d754c18118cc req-753a7a11-00a0-47d8-91d4-d5814ab68d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.028 182096 DEBUG oslo_concurrency.lockutils [req-8df74aa5-d989-41d8-ac5f-d754c18118cc req-753a7a11-00a0-47d8-91d4-d5814ab68d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.028 182096 DEBUG nova.compute.manager [req-8df74aa5-d989-41d8-ac5f-d754c18118cc req-753a7a11-00a0-47d8-91d4-d5814ab68d8d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Processing event network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.029 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.033 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161186.0328588, c090cdd2-7b44-4f97-9ccb-253fe16eef62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.033 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] VM Resumed (Lifecycle Event)
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.035 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.037 182096 INFO nova.virt.libvirt.driver [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Instance spawned successfully.
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.037 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.057 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.060 182096 DEBUG nova.compute.manager [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.060 182096 DEBUG nova.compute.manager [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing instance network info cache due to event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.060 182096 DEBUG oslo_concurrency.lockutils [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.061 182096 DEBUG oslo_concurrency.lockutils [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.061 182096 DEBUG nova.network.neutron [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.064 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.065 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.065 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.065 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.066 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.066 182096 DEBUG nova.virt.libvirt.driver [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.070 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.106 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.154 182096 INFO nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Took 4.04 seconds to spawn the instance on the hypervisor.
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.155 182096 DEBUG nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.228 182096 INFO nova.compute.manager [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Took 4.42 seconds to build instance.
Jan 23 09:39:46 compute-0 nova_compute[182092]: 2026-01-23 09:39:46.240 182096 DEBUG oslo_concurrency.lockutils [None req-b9a1c2c5-0984-4492-a1d0-56da3fa9be01 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:47 compute-0 nova_compute[182092]: 2026-01-23 09:39:47.085 182096 DEBUG nova.network.neutron [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updated VIF entry in instance network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:39:47 compute-0 nova_compute[182092]: 2026-01-23 09:39:47.085 182096 DEBUG nova.network.neutron [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updating instance_info_cache with network_info: [{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:47 compute-0 nova_compute[182092]: 2026-01-23 09:39:47.106 182096 DEBUG oslo_concurrency.lockutils [req-dda255d9-037d-4e72-a740-c92ad04a626b req-afa5f06e-de07-44d7-85e6-d0ac3a57d63c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:47 compute-0 podman[230066]: 2026-01-23 09:39:47.223992631 +0000 UTC m=+0.061274779 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:39:47 compute-0 nova_compute[182092]: 2026-01-23 09:39:47.512 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:47 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:39:47.632 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.098 182096 DEBUG nova.compute.manager [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.099 182096 DEBUG oslo_concurrency.lockutils [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.099 182096 DEBUG oslo_concurrency.lockutils [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.099 182096 DEBUG oslo_concurrency.lockutils [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.099 182096 DEBUG nova.compute.manager [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] No waiting events found dispatching network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:39:48 compute-0 nova_compute[182092]: 2026-01-23 09:39:48.099 182096 WARNING nova.compute.manager [req-ee652b75-c7f0-4f42-8627-c928b5f2da56 req-58fe3019-0033-4318-8ec8-84c7d7ae7d49 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received unexpected event network-vif-plugged-dd650e27-b8d1-4200-a549-2495e4e181fa for instance with vm_state active and task_state None.
Jan 23 09:39:49 compute-0 nova_compute[182092]: 2026-01-23 09:39:49.691 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.171 182096 DEBUG nova.compute.manager [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.171 182096 DEBUG nova.compute.manager [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing instance network info cache due to event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.171 182096 DEBUG oslo_concurrency.lockutils [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.171 182096 DEBUG oslo_concurrency.lockutils [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.172 182096 DEBUG nova.network.neutron [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.946 182096 DEBUG nova.network.neutron [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updated VIF entry in instance network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.947 182096 DEBUG nova.network.neutron [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updating instance_info_cache with network_info: [{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:39:50 compute-0 nova_compute[182092]: 2026-01-23 09:39:50.960 182096 DEBUG oslo_concurrency.lockutils [req-4636652b-d8f9-4c8b-ba27-9719122834a5 req-c74e9ce5-4c15-4862-bc88-bc5d092afb48 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:39:52 compute-0 nova_compute[182092]: 2026-01-23 09:39:52.513 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:54 compute-0 nova_compute[182092]: 2026-01-23 09:39:54.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:55 compute-0 podman[230090]: 2026-01-23 09:39:55.213163392 +0000 UTC m=+0.047251594 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:39:55 compute-0 podman[230089]: 2026-01-23 09:39:55.218620352 +0000 UTC m=+0.054082776 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:39:57 compute-0 nova_compute[182092]: 2026-01-23 09:39:57.516 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:39:58 compute-0 ovn_controller[94697]: 2026-01-23T09:39:58Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:8d:19 10.100.0.13
Jan 23 09:39:58 compute-0 ovn_controller[94697]: 2026-01-23T09:39:58Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:8d:19 10.100.0.13
Jan 23 09:39:59 compute-0 nova_compute[182092]: 2026-01-23 09:39:59.696 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:02 compute-0 nova_compute[182092]: 2026-01-23 09:40:02.518 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:04 compute-0 nova_compute[182092]: 2026-01-23 09:40:04.699 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:06 compute-0 podman[230141]: 2026-01-23 09:40:06.202650224 +0000 UTC m=+0.038675493 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 09:40:06 compute-0 podman[230142]: 2026-01-23 09:40:06.204323879 +0000 UTC m=+0.037588953 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:40:07 compute-0 nova_compute[182092]: 2026-01-23 09:40:07.521 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.259 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.259 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.259 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.260 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.260 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.267 182096 INFO nova.compute.manager [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Terminating instance
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.273 182096 DEBUG nova.compute.manager [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:40:08 compute-0 kernel: tap2ffa52f2-84 (unregistering): left promiscuous mode
Jan 23 09:40:08 compute-0 NetworkManager[54920]: <info>  [1769161208.2998] device (tap2ffa52f2-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00633|binding|INFO|Releasing lport 2ffa52f2-8471-4c8c-ada7-a235d863feed from this chassis (sb_readonly=0)
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00634|binding|INFO|Setting lport 2ffa52f2-8471-4c8c-ada7-a235d863feed down in Southbound
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.305 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00635|binding|INFO|Removing iface tap2ffa52f2-84 ovn-installed in OVS
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.308 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.312 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:58 10.100.0.12'], port_security=['fa:16:3e:52:70:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e72efe-6307-41da-8074-25766fc6d9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6aa0531-db76-40ab-b0a7-eb7a32653be0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2ffa52f2-8471-4c8c-ada7-a235d863feed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.313 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2ffa52f2-8471-4c8c-ada7-a235d863feed in datapath 19e72efe-6307-41da-8074-25766fc6d9f3 unbound from our chassis
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.315 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19e72efe-6307-41da-8074-25766fc6d9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.316 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7e7008-b984-482e-9c3e-9da71dfdda36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.316 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3 namespace which is not needed anymore
Jan 23 09:40:08 compute-0 kernel: tapb3f8d54c-85 (unregistering): left promiscuous mode
Jan 23 09:40:08 compute-0 NetworkManager[54920]: <info>  [1769161208.3219] device (tapb3f8d54c-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.323 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00636|binding|INFO|Releasing lport b3f8d54c-85ef-4fa7-8739-13144ea79fd4 from this chassis (sb_readonly=0)
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00637|binding|INFO|Setting lport b3f8d54c-85ef-4fa7-8739-13144ea79fd4 down in Southbound
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00638|binding|INFO|Removing iface tapb3f8d54c-85 ovn-installed in OVS
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.326 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.328 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.331 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:72:9b 2001:db8::f816:3eff:fea8:729b'], port_security=['fa:16:3e:a8:72:9b 2001:db8::f816:3eff:fea8:729b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea8:729b/64', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70104d04-d5ee-47d4-b059-a0882ae695ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7a180e3-fed5-40ae-a930-fe216e9b2af2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=b3f8d54c-85ef-4fa7-8739-13144ea79fd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.354 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 23 09:40:08 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009b.scope: Consumed 12.945s CPU time.
Jan 23 09:40:08 compute-0 systemd-machined[153562]: Machine qemu-78-instance-0000009b terminated.
Jan 23 09:40:08 compute-0 podman[230178]: 2026-01-23 09:40:08.41256961 +0000 UTC m=+0.094585033 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [NOTICE]   (229644) : haproxy version is 2.8.14-c23fe91
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [NOTICE]   (229644) : path to executable is /usr/sbin/haproxy
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [WARNING]  (229644) : Exiting Master process...
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [WARNING]  (229644) : Exiting Master process...
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [ALERT]    (229644) : Current worker (229646) exited with code 143 (Terminated)
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3[229640]: [WARNING]  (229644) : All workers exited. Exiting... (0)
Jan 23 09:40:08 compute-0 systemd[1]: libpod-c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa.scope: Deactivated successfully.
Jan 23 09:40:08 compute-0 podman[230220]: 2026-01-23 09:40:08.434262276 +0000 UTC m=+0.039014912 container died c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa-userdata-shm.mount: Deactivated successfully.
Jan 23 09:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-a3bf369a99a0284bb45fc502ff5b20065aa3db706cd4eecfa2afb7802d5ad6f2-merged.mount: Deactivated successfully.
Jan 23 09:40:08 compute-0 podman[230220]: 2026-01-23 09:40:08.451675143 +0000 UTC m=+0.056427778 container cleanup c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:40:08 compute-0 systemd[1]: libpod-conmon-c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa.scope: Deactivated successfully.
Jan 23 09:40:08 compute-0 kernel: tap2ffa52f2-84: entered promiscuous mode
Jan 23 09:40:08 compute-0 systemd-udevd[230206]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:40:08 compute-0 NetworkManager[54920]: <info>  [1769161208.4861] manager: (tap2ffa52f2-84): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 23 09:40:08 compute-0 kernel: tap2ffa52f2-84 (unregistering): left promiscuous mode
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.497 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 NetworkManager[54920]: <info>  [1769161208.4989] manager: (tapb3f8d54c-85): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00639|binding|INFO|Claiming lport 2ffa52f2-8471-4c8c-ada7-a235d863feed for this chassis.
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00640|binding|INFO|2ffa52f2-8471-4c8c-ada7-a235d863feed: Claiming fa:16:3e:52:70:58 10.100.0.12
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.508 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:58 10.100.0.12'], port_security=['fa:16:3e:52:70:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e72efe-6307-41da-8074-25766fc6d9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6aa0531-db76-40ab-b0a7-eb7a32653be0, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2ffa52f2-8471-4c8c-ada7-a235d863feed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:08 compute-0 podman[230243]: 2026-01-23 09:40:08.512436552 +0000 UTC m=+0.046591259 container remove c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.518 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1a38789c-3360-4c1c-9147-a317130a4e13]: (4, ('Fri Jan 23 09:40:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3 (c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa)\nc07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa\nFri Jan 23 09:40:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3 (c07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa)\nc07128dec98117b825fda438f2c7948c7a2d3d388dcf31e812cdc83f6d7a6dfa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_controller[94697]: 2026-01-23T09:40:08Z|00641|binding|INFO|Releasing lport 2ffa52f2-8471-4c8c-ada7-a235d863feed from this chassis (sb_readonly=0)
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.519 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[88a49b98-402c-4353-a8a6-5159e53fa163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.520 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19e72efe-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.522 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.527 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:58 10.100.0.12'], port_security=['fa:16:3e:52:70:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a7e87540-5713-4ac2-a9c8-942e11144ee8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e72efe-6307-41da-8074-25766fc6d9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e38e9d0-977f-42cc-8e60-4bdce73c3090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6aa0531-db76-40ab-b0a7-eb7a32653be0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2ffa52f2-8471-4c8c-ada7-a235d863feed) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.540 182096 INFO nova.virt.libvirt.driver [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance destroyed successfully.
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.541 182096 DEBUG nova.objects.instance [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid a7e87540-5713-4ac2-a9c8-942e11144ee8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:40:08 compute-0 kernel: tap19e72efe-60: left promiscuous mode
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.550 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ab805825-7820-43fd-99a0-f480ba1b41ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.559 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5848f28a-dec9-4fee-b0d8-26ba8b8cc8bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.560 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6c587483-d8e3-4b90-b91d-3a5196130eba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.561 182096 DEBUG nova.virt.libvirt.vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:39:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:39:11Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.562 182096 DEBUG nova.network.os_vif_util [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.562 182096 DEBUG nova.network.os_vif_util [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.563 182096 DEBUG os_vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.564 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.564 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ffa52f2-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.566 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.568 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.569 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.571 182096 INFO os_vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:70:58,bridge_name='br-int',has_traffic_filtering=True,id=2ffa52f2-8471-4c8c-ada7-a235d863feed,network=Network(19e72efe-6307-41da-8074-25766fc6d9f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ffa52f2-84')
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.571 182096 DEBUG nova.virt.libvirt.vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:39:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-939246726',display_name='tempest-TestGettingAddress-server-939246726',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-939246726',id=155,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+vq503RsRv3NlbK2HqykQzUUUSumUFu3tzH7J4rcOQwkWVuSyj4F67HdGwPzSRWgEPqTVbQUEPNN9j6dgWLKb+/Lfysg6PG38f9xMl2gfvn9tbA0X8cV0TRrBlxVULog==',key_name='tempest-TestGettingAddress-1228770733',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:39:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-cw93do7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:39:11Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a7e87540-5713-4ac2-a9c8-942e11144ee8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.571 182096 DEBUG nova.network.os_vif_util [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.572 182096 DEBUG nova.network.os_vif_util [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.572 182096 DEBUG os_vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.573 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.573 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[280c11a9-9de7-43d5-acac-a5653b5491f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465628, 'reachable_time': 21383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230277, 'error': None, 'target': 'ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.573 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3f8d54c-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d19e72efe\x2d6307\x2d41da\x2d8074\x2d25766fc6d9f3.mount: Deactivated successfully.
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.575 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19e72efe-6307-41da-8074-25766fc6d9f3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.575 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[d9af2f96-901a-47f9-8019-1fa4b6dcd989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.577 103978 INFO neutron.agent.ovn.metadata.agent [-] Port b3f8d54c-85ef-4fa7-8739-13144ea79fd4 in datapath 70104d04-d5ee-47d4-b059-a0882ae695ee unbound from our chassis
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.576 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.578 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70104d04-d5ee-47d4-b059-a0882ae695ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.578 182096 INFO os_vif [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:72:9b,bridge_name='br-int',has_traffic_filtering=True,id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4,network=Network(70104d04-d5ee-47d4-b059-a0882ae695ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3f8d54c-85')
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.578 182096 INFO nova.virt.libvirt.driver [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Deleting instance files /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8_del
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.579 182096 INFO nova.virt.libvirt.driver [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Deletion of /var/lib/nova/instances/a7e87540-5713-4ac2-a9c8-942e11144ee8_del complete
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.579 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac615ef-0a4d-4330-b4a3-e2823fe3ce7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.579 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee namespace which is not needed anymore
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [NOTICE]   (229713) : haproxy version is 2.8.14-c23fe91
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [NOTICE]   (229713) : path to executable is /usr/sbin/haproxy
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [ALERT]    (229713) : Current worker (229715) exited with code 143 (Terminated)
Jan 23 09:40:08 compute-0 neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee[229709]: [WARNING]  (229713) : All workers exited. Exiting... (0)
Jan 23 09:40:08 compute-0 systemd[1]: libpod-21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89.scope: Deactivated successfully.
Jan 23 09:40:08 compute-0 conmon[229709]: conmon 21fed4f841fe98593f10 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89.scope/container/memory.events
Jan 23 09:40:08 compute-0 podman[230293]: 2026-01-23 09:40:08.674063726 +0000 UTC m=+0.033576678 container died 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89-userdata-shm.mount: Deactivated successfully.
Jan 23 09:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-674cb646332f72b69a2d7ac8ceb3d69fa85ca6519f94007735318ce4fdfc44a2-merged.mount: Deactivated successfully.
Jan 23 09:40:08 compute-0 podman[230293]: 2026-01-23 09:40:08.691845318 +0000 UTC m=+0.051358249 container cleanup 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:40:08 compute-0 systemd[1]: libpod-conmon-21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89.scope: Deactivated successfully.
Jan 23 09:40:08 compute-0 podman[230316]: 2026-01-23 09:40:08.731028006 +0000 UTC m=+0.023766337 container remove 21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.734 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0da384-3446-47b2-bb79-80bbe211229f]: (4, ('Fri Jan 23 09:40:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee (21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89)\n21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89\nFri Jan 23 09:40:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee (21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89)\n21fed4f841fe98593f102bb8629cae9cfe88b721d1fea4736f679b450215dd89\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.735 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a4509c27-532f-4b90-b965-21032ff826d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.736 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70104d04-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:08 compute-0 kernel: tap70104d04-d0: left promiscuous mode
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.750 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.752 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[407c6ffd-a35f-46a4-b445-5f902d4a41ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.765 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[79d9e4e8-1346-479a-95d7-6733acc89920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.765 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[24acc1fe-40b6-4948-98f4-83caf831103e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.777 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50dd35e2-a055-4782-b3cf-27e982a14416]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465691, 'reachable_time': 38339, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230328, 'error': None, 'target': 'ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.779 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-70104d04-d5ee-47d4-b059-a0882ae695ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.779 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[628dbdc7-de46-442c-b282-08d93bc1e82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.779 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2ffa52f2-8471-4c8c-ada7-a235d863feed in datapath 19e72efe-6307-41da-8074-25766fc6d9f3 unbound from our chassis
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.781 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19e72efe-6307-41da-8074-25766fc6d9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.781 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[eda7b2aa-b337-4f7a-b950-ba8d71448abe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.782 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2ffa52f2-8471-4c8c-ada7-a235d863feed in datapath 19e72efe-6307-41da-8074-25766fc6d9f3 unbound from our chassis
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.783 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19e72efe-6307-41da-8074-25766fc6d9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:08 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:08.783 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[463bdf12-deeb-43c6-9569-1d4f10bca004]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.810 182096 DEBUG nova.compute.manager [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-unplugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.811 182096 DEBUG oslo_concurrency.lockutils [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.811 182096 DEBUG oslo_concurrency.lockutils [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.811 182096 DEBUG oslo_concurrency.lockutils [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.811 182096 DEBUG nova.compute.manager [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-unplugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.812 182096 DEBUG nova.compute.manager [req-707c583c-6897-492d-ba3f-a05faf6e9f6c req-3bb582f8-6ad1-4328-9edf-7108d6603d34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-unplugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.837 182096 INFO nova.compute.manager [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Took 0.56 seconds to destroy the instance on the hypervisor.
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.837 182096 DEBUG oslo.service.loopingcall [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.837 182096 DEBUG nova.compute.manager [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.837 182096 DEBUG nova.network.neutron [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.887 182096 DEBUG nova.compute.manager [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.888 182096 DEBUG nova.compute.manager [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing instance network info cache due to event network-changed-2ffa52f2-8471-4c8c-ada7-a235d863feed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.888 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.888 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:40:08 compute-0 nova_compute[182092]: 2026-01-23 09:40:08.888 182096 DEBUG nova.network.neutron [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Refreshing network info cache for port 2ffa52f2-8471-4c8c-ada7-a235d863feed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:40:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d70104d04\x2dd5ee\x2d47d4\x2db059\x2da0882ae695ee.mount: Deactivated successfully.
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.456 182096 DEBUG nova.compute.manager [req-1c72c2aa-ebb8-455d-9ba8-e2c4ba084113 req-02be1fc3-d282-4c22-bdb8-92cbda82fad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-deleted-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.457 182096 INFO nova.compute.manager [req-1c72c2aa-ebb8-455d-9ba8-e2c4ba084113 req-02be1fc3-d282-4c22-bdb8-92cbda82fad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Neutron deleted interface 2ffa52f2-8471-4c8c-ada7-a235d863feed; detaching it from the instance and deleting it from the info cache
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.457 182096 DEBUG nova.network.neutron [req-1c72c2aa-ebb8-455d-9ba8-e2c4ba084113 req-02be1fc3-d282-4c22-bdb8-92cbda82fad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [{"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.591 182096 DEBUG nova.compute.manager [req-1c72c2aa-ebb8-455d-9ba8-e2c4ba084113 req-02be1fc3-d282-4c22-bdb8-92cbda82fad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Detach interface failed, port_id=2ffa52f2-8471-4c8c-ada7-a235d863feed, reason: Instance a7e87540-5713-4ac2-a9c8-942e11144ee8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.758 182096 DEBUG nova.network.neutron [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.776 182096 INFO nova.compute.manager [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Took 0.94 seconds to deallocate network for instance.
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.844 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.845 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.898 182096 DEBUG nova.compute.provider_tree [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.911 182096 DEBUG nova.scheduler.client.report [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.924 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.941 182096 INFO nova.scheduler.client.report [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance a7e87540-5713-4ac2-a9c8-942e11144ee8
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.982 182096 DEBUG nova.network.neutron [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updated VIF entry in instance network info cache for port 2ffa52f2-8471-4c8c-ada7-a235d863feed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:40:09 compute-0 nova_compute[182092]: 2026-01-23 09:40:09.982 182096 DEBUG nova.network.neutron [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Updating instance_info_cache with network_info: [{"id": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "address": "fa:16:3e:52:70:58", "network": {"id": "19e72efe-6307-41da-8074-25766fc6d9f3", "bridge": "br-int", "label": "tempest-network-smoke--842482118", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ffa52f2-84", "ovs_interfaceid": "2ffa52f2-8471-4c8c-ada7-a235d863feed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "address": "fa:16:3e:a8:72:9b", "network": {"id": "70104d04-d5ee-47d4-b059-a0882ae695ee", "bridge": "br-int", "label": "tempest-network-smoke--856048650", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea8:729b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3f8d54c-85", "ovs_interfaceid": "b3f8d54c-85ef-4fa7-8739-13144ea79fd4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a7e87540-5713-4ac2-a9c8-942e11144ee8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG nova.compute.manager [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-unplugged-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG oslo_concurrency.lockutils [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.001 182096 DEBUG nova.compute.manager [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-unplugged-2ffa52f2-8471-4c8c-ada7-a235d863feed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.002 182096 DEBUG nova.compute.manager [req-1f9f5c0b-f8a6-4c2d-b2e7-cae15c9a532e req-1788bdee-010d-4f1d-8030-ce62631cf37a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-unplugged-2ffa52f2-8471-4c8c-ada7-a235d863feed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.005 182096 DEBUG oslo_concurrency.lockutils [None req-3feeb31b-0311-48b2-a40e-9d98b19610bd 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.746s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.895 182096 DEBUG nova.compute.manager [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.895 182096 DEBUG oslo_concurrency.lockutils [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.896 182096 DEBUG oslo_concurrency.lockutils [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.896 182096 DEBUG oslo_concurrency.lockutils [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.896 182096 DEBUG nova.compute.manager [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.896 182096 WARNING nova.compute.manager [req-fbd29618-307d-4ea4-98a9-bdd045da2fa1 req-98f0657d-2fd0-4021-b310-4cb6a8219be1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received unexpected event network-vif-plugged-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 for instance with vm_state deleted and task_state None.
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.983 182096 DEBUG nova.compute.manager [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.983 182096 DEBUG oslo_concurrency.lockutils [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.984 182096 DEBUG oslo_concurrency.lockutils [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.984 182096 DEBUG oslo_concurrency.lockutils [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a7e87540-5713-4ac2-a9c8-942e11144ee8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.984 182096 DEBUG nova.compute.manager [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] No waiting events found dispatching network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:40:10 compute-0 nova_compute[182092]: 2026-01-23 09:40:10.984 182096 WARNING nova.compute.manager [req-47c4bf9a-875a-48ec-99ba-0af07a331fbb req-8d9a6e11-0b42-4127-859f-b53eeae9911f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received unexpected event network-vif-plugged-2ffa52f2-8471-4c8c-ada7-a235d863feed for instance with vm_state deleted and task_state None.
Jan 23 09:40:11 compute-0 nova_compute[182092]: 2026-01-23 09:40:11.518 182096 DEBUG nova.compute.manager [req-7aa501dd-7448-437f-9380-da3acef53be8 req-70bc27fa-3563-4434-849c-5a977aef7057 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Received event network-vif-deleted-b3f8d54c-85ef-4fa7-8739-13144ea79fd4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:11 compute-0 nova_compute[182092]: 2026-01-23 09:40:11.519 182096 INFO nova.compute.manager [req-7aa501dd-7448-437f-9380-da3acef53be8 req-70bc27fa-3563-4434-849c-5a977aef7057 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Neutron deleted interface b3f8d54c-85ef-4fa7-8739-13144ea79fd4; detaching it from the instance and deleting it from the info cache
Jan 23 09:40:11 compute-0 nova_compute[182092]: 2026-01-23 09:40:11.519 182096 DEBUG nova.network.neutron [req-7aa501dd-7448-437f-9380-da3acef53be8 req-70bc27fa-3563-4434-849c-5a977aef7057 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 23 09:40:11 compute-0 nova_compute[182092]: 2026-01-23 09:40:11.520 182096 DEBUG nova.compute.manager [req-7aa501dd-7448-437f-9380-da3acef53be8 req-70bc27fa-3563-4434-849c-5a977aef7057 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Detach interface failed, port_id=b3f8d54c-85ef-4fa7-8739-13144ea79fd4, reason: Instance a7e87540-5713-4ac2-a9c8-942e11144ee8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:40:12 compute-0 nova_compute[182092]: 2026-01-23 09:40:12.522 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:12 compute-0 nova_compute[182092]: 2026-01-23 09:40:12.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:13 compute-0 nova_compute[182092]: 2026-01-23 09:40:13.575 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:14 compute-0 nova_compute[182092]: 2026-01-23 09:40:14.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:15 compute-0 ovn_controller[94697]: 2026-01-23T09:40:15Z|00642|binding|INFO|Releasing lport ddbf3865-7fb3-4d33-bb90-eb46c309610d from this chassis (sb_readonly=0)
Jan 23 09:40:15 compute-0 nova_compute[182092]: 2026-01-23 09:40:15.116 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.664 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:16 compute-0 nova_compute[182092]: 2026-01-23 09:40:16.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:17 compute-0 nova_compute[182092]: 2026-01-23 09:40:17.523 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:18 compute-0 podman[230329]: 2026-01-23 09:40:18.22540692 +0000 UTC m=+0.060221101 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.576 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.672 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.672 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.672 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.715 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.763 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.764 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:18 compute-0 nova_compute[182092]: 2026-01-23 09:40:18.810 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.015 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.016 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5575MB free_disk=73.18394470214844GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.016 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.017 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.072 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance c090cdd2-7b44-4f97-9ccb-253fe16eef62 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.073 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.073 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.113 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.127 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.144 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:40:19 compute-0 nova_compute[182092]: 2026-01-23 09:40:19.144 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:21 compute-0 nova_compute[182092]: 2026-01-23 09:40:21.145 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:22 compute-0 nova_compute[182092]: 2026-01-23 09:40:22.524 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:22 compute-0 nova_compute[182092]: 2026-01-23 09:40:22.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:22 compute-0 nova_compute[182092]: 2026-01-23 09:40:22.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:40:23 compute-0 nova_compute[182092]: 2026-01-23 09:40:23.539 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161208.538522, a7e87540-5713-4ac2-a9c8-942e11144ee8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:40:23 compute-0 nova_compute[182092]: 2026-01-23 09:40:23.539 182096 INFO nova.compute.manager [-] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] VM Stopped (Lifecycle Event)
Jan 23 09:40:23 compute-0 nova_compute[182092]: 2026-01-23 09:40:23.555 182096 DEBUG nova.compute.manager [None req-f5dee852-1724-4d16-a26a-948d8a7c121c - - - - - -] [instance: a7e87540-5713-4ac2-a9c8-942e11144ee8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:23 compute-0 nova_compute[182092]: 2026-01-23 09:40:23.579 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:24 compute-0 nova_compute[182092]: 2026-01-23 09:40:24.438 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:24.915 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:24.916 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:40:24 compute-0 nova_compute[182092]: 2026-01-23 09:40:24.917 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:24.917 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:25 compute-0 nova_compute[182092]: 2026-01-23 09:40:25.852 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:26 compute-0 podman[230359]: 2026-01-23 09:40:26.202526115 +0000 UTC m=+0.040639456 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:40:26 compute-0 podman[230360]: 2026-01-23 09:40:26.203010128 +0000 UTC m=+0.038574452 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:40:27 compute-0 nova_compute[182092]: 2026-01-23 09:40:27.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:27 compute-0 nova_compute[182092]: 2026-01-23 09:40:27.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:40:28 compute-0 nova_compute[182092]: 2026-01-23 09:40:28.580 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:32 compute-0 nova_compute[182092]: 2026-01-23 09:40:32.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:32 compute-0 nova_compute[182092]: 2026-01-23 09:40:32.688 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.003 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'hostId': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.011 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.011 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b2886af-4fc4-4f9d-ae54-fedb5b22542c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.003718', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ff1fc70-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': 'a5fcbf32bad97b4fdb08571b3439920bc3acc0973e759ef8c6dc92a0d24ab528'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.003718', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ff206fc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': '733fcec76135792d4f0212ccee844ca335b10bd6d6d3a3c659793231fe93cdfd'}]}, 'timestamp': '2026-01-23 09:40:33.011794', '_unique_id': 'f8f9d9dcc4aa421282b1530e22521f8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.012 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.014 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c090cdd2-7b44-4f97-9ccb-253fe16eef62 / tapdd650e27-b8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.014 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7450bcdf-237a-4852-a0f7-db3b40091994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.013279', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff2808c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '9d9cfc2af3a8214282974026d71aaeca31d2582efca28ec3dabb54aadbd9ec12'}]}, 'timestamp': '2026-01-23 09:40:33.014906', '_unique_id': '62c87e009aae4ae5a77c89d05a4fbc77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.015 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6e35407-fdc9-4f77-9a93-35b504094efd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.015967', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff2b2f0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '7441178de0587a6d3e842ccf6140197cc4b44ec10f6d2848be4abb2c900dc1dd'}]}, 'timestamp': '2026-01-23 09:40:33.016194', '_unique_id': 'cbfae6915ed144a19dc999ca18a11e03'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.016 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd861ec3-507c-4288-b6cd-c80d0a919e78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.017220', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff2e3ce-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': 'e30647fbe4c35fe9e0e5235419a5d2508ab4d8e5c4e06d660c511c05ef5a3ed4'}]}, 'timestamp': '2026-01-23 09:40:33.017444', '_unique_id': '790ab15c526849e698b7f3debedf3a7d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.017 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.018 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.028 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/memory.usage volume: 42.59375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7fc158a-0c65-41b1-97fe-f122a169db18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59375, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'timestamp': '2026-01-23T09:40:33.018480', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8ff4a3e4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.557991592, 'message_signature': '81d2ae8fc22e116018b61b520915588005b963487ed2372e08f96539ee48ebd1'}]}, 'timestamp': '2026-01-23 09:40:33.028910', '_unique_id': 'cf007ad98e0847b9b003a23dce1bb537'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.029 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/cpu volume: 10240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e0430ca-9496-406b-a20f-0fc5100b7dd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10240000000, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'timestamp': '2026-01-23T09:40:33.029962', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8ff4d580-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.557991592, 'message_signature': '2834a93ca45ee455f74209c9a980dc111d74c1e2901627472784f1b5cb37247c'}]}, 'timestamp': '2026-01-23 09:40:33.030180', '_unique_id': 'a097fa8917174193a606be3edcf8b532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.048 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.latency volume: 257492441 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.048 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.latency volume: 119295814 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2ee985c-bfbd-4954-b805-faab81b680d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 257492441, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.031207', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ff7a972-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '78f55a2dbff321f3471ab9f92444fe685ea04b69b338528ba1eb2667b17b07fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 119295814, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.031207', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ff7b78c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': 'b05af489afddf64e05ab9438ede3a86f06d5883b459672abb0bf9b8d40329a37'}]}, 'timestamp': '2026-01-23 09:40:33.049085', '_unique_id': '778bfa013eb349b4a20d916a730785f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.049 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.050 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6009788a-88f8-4174-82a7-28f83c06b205', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.050942', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff80b06-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '1fab0eb1c65c66b72f12a4ee56f32755889cf9c7244ce0182047bfd2094b52bb'}]}, 'timestamp': '2026-01-23 09:40:33.051222', '_unique_id': 'b1771fb20df04c26bccca32c9a2cf14e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.051 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.052 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.052 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>]
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa9605b2-bdde-4bb6-a8b2-bf9ad928bb76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.053122', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff85fd4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': 'bbd65687bb095ef1bfc221cc4e7c3c02500c860ae8976d2aba2831c5a20c75f0'}]}, 'timestamp': '2026-01-23 09:40:33.053390', '_unique_id': 'c51cbf77f09848cbb071a70796807b44'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.053 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.054 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.bytes volume: 73052160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99f3e078-672c-4c44-b23a-b957b7f777da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73052160, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.054830', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ff8a246-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '31118ede89d3483b4843c9e8eb6500b160758529ee7d0aa2b1de41818fb9e985'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.054830', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ff8ab7e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '520442e7f58c4801c719bbcfbaf1c6dd208d0ed062deb1f0b59abc13b694a637'}]}, 'timestamp': '2026-01-23 09:40:33.055310', '_unique_id': 'cb25ea080ead4e9a9b3be8ff43e00209'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.056 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.incoming.bytes volume: 34403 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd8a2139-8f91-4cdb-8afc-8a83fc1cc0d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 34403, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.056728', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff8ec92-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '3d26defa98c78a009744c25a0af62932d0127edd83f58f7f92317342c6d3ebe0'}]}, 'timestamp': '2026-01-23 09:40:33.056990', '_unique_id': '12531cbaa5f84a6196cc1d728a2774ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.058 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.incoming.packets volume: 181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92412d0e-a7ed-4423-865a-c63604a2b072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 181, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.058330', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff92aea-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': 'cb123c6866f9d2c69929a66cb27656611603acbcb784364f01ac62b7884599e2'}]}, 'timestamp': '2026-01-23 09:40:33.058586', '_unique_id': 'c5b063b58ba84ccd8cebd6742b6c2db1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.059 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.060 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.latency volume: 296384864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.060 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19de15c2-c60c-4636-8462-acbcbf7c0f88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 296384864, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.060104', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ff975a4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': 'f90f3a2598e0b2475c0fb06feb4058d3ed6c34d44eeff7b7bffe78e1f399cd1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.060104', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ff97e96-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '938f09434600dab9a296dbefd0673eb9cd808060b2e96c5b56f4fdb07d00b152'}]}, 'timestamp': '2026-01-23 09:40:33.060746', '_unique_id': 'fcbde4a8981b4114b413a6f1ce1337ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.062 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.062 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>]
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.062 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.outgoing.packets volume: 217 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da2a197c-5d65-457a-8ea7-de6a205040c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 217, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.062481', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ff9ccde-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': 'bc6702180a302401c50094477427a24f6c28010a5c55949a8ca75a5712b7ec96'}]}, 'timestamp': '2026-01-23 09:40:33.062760', '_unique_id': '668b7f527e284d19b2d9b8852a7a8517'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.064 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.bytes volume: 30317056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.064 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef63d926-6c55-497a-8157-956bbac377b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30317056, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.064113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ffa0cc6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '93ea5dc240bfb82ce74dc5aa37bfa4564786d4358da8a721848880417adb6c34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.064113', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ffa15c2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': 'e0fbbab4a89b5b1172f189728918967f04ea7843e643a3df316c5798ef123e46'}]}, 'timestamp': '2026-01-23 09:40:33.064585', '_unique_id': '3c952f5288a14310bb62745668006e40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff5be421-cb8e-4de9-b202-30185f7f3f4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.065990', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ffa5654-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '9826dca91c18b6a04213589f2133815bd157c909566145ff1f40920068f187ec'}]}, 'timestamp': '2026-01-23 09:40:33.066252', '_unique_id': 'f0ae8062f48e4beabac298e1161e0751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.066 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.067 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.067 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/network.outgoing.bytes volume: 30964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7a370e4-b462-4fd3-a9b9-429b8ef2b7d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30964, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'instance-0000009e-c090cdd2-7b44-4f97-9ccb-253fe16eef62-tapdd650e27-b8', 'timestamp': '2026-01-23T09:40:33.067770', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'tapdd650e27-b8', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:8d:19', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdd650e27-b8'}, 'message_id': '8ffa9bdc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.542661434, 'message_signature': '9173326c261fd77142d1e9c4a07bea4034616d0f90a5619fd37d4e81a8539d49'}]}, 'timestamp': '2026-01-23 09:40:33.068030', '_unique_id': 'bd5de4c38ae84c9984e1f56628919f26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.069 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.069 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.069 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4231ffee-2b74-48bc-b40a-48c54495c3d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.069354', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ffad98a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': 'd68caa7e4dda30005b5e7b597cd042f3780b13f358ef5f5871a7638b8634a704'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.069354', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ffae3ee-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': '153b8c98892eb679c8fa45b26f0eba106b4f172d5e6b1d139ef139dd59e6e1f1'}]}, 'timestamp': '2026-01-23 09:40:33.069865', '_unique_id': '0507ca6f961646d8a4537e4ff6b6006c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>]
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.071 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '686f28d4-c743-4efc-a63d-7f3ea700f179', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.071580', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ffb315a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': '1ba24aa16515512bed75e780eb800f80dd1179566ad8b5ac328460c6ae0f3762'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.071580', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ffb3a60-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.533102458, 'message_signature': '2dc2405357ad57acca096346dc7de7adc1c9a8e67822f9dc3836f33a151e0f4a'}]}, 'timestamp': '2026-01-23 09:40:33.072076', '_unique_id': '708c4b6c592940e9935869c20ab8025e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.073 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.073 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50b122a1-9da4-42a6-9168-6852545b6f30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.073433', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ffb78e0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '3f3ccfec21aa483bac214cd7b9637a8330048245692bcf3dc2c2eba92d31a2b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': 
'5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.073433', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ffb8380-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '38fbe7e099baf88696496736ae1ca158daf8e13f0347785c7ff7973113111b30'}]}, 'timestamp': '2026-01-23 09:40:33.073949', '_unique_id': '65c9883c43b24e77854498b3ebf00958'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385>]
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.requests volume: 1085 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.075 12 DEBUG ceilometer.compute.pollsters [-] c090cdd2-7b44-4f97-9ccb-253fe16eef62/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '050cb233-e937-4ac4-aafb-8a5879c03b0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1085, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-vda', 'timestamp': '2026-01-23T09:40:33.075694', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8ffbd150-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': 'ad580fb7b25606e3f668f1aee4f8d5ca357527c960687370e3129895b60611fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ca9dd50592344370a9c82fc29637452b', 'user_name': None, 'project_id': 
'5be4fcb5d9df463eb390de7dc8be6fec', 'project_name': None, 'resource_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62-sda', 'timestamp': '2026-01-23T09:40:33.075694', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385', 'name': 'instance-0000009e', 'instance_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'instance_type': 'm1.nano', 'host': '73123354eece802613fde78149938a07daeb55b820e61aa84f4df4c7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8ffbda2e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4738.560587687, 'message_signature': '583074cfe0e19a54ec0765af8b71309981026ae112c5835eceab7bbba14a9aa9'}]}, 'timestamp': '2026-01-23 09:40:33.076167', '_unique_id': 'fd401c110195441dbfbb26bff8b57f1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:40:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:40:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:40:33 compute-0 nova_compute[182092]: 2026-01-23 09:40:33.582 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:34.267 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:96:a5 2001:db8:0:1:f816:3eff:fe47:96a5 2001:db8::f816:3eff:fe47:96a5'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe47:96a5/64 2001:db8::f816:3eff:fe47:96a5/64', 'neutron:device_id': 'ovnmeta-209c613a-95e1-4666-8213-5d13e99d782b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-209c613a-95e1-4666-8213-5d13e99d782b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d600020a-932e-4dc5-a06d-992855600728, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=441ade13-f4fb-4ada-9c1d-f680f7355e3d) old=Port_Binding(mac=['fa:16:3e:47:96:a5 2001:db8::f816:3eff:fe47:96a5'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe47:96a5/64', 'neutron:device_id': 'ovnmeta-209c613a-95e1-4666-8213-5d13e99d782b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-209c613a-95e1-4666-8213-5d13e99d782b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:34.268 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 441ade13-f4fb-4ada-9c1d-f680f7355e3d in datapath 209c613a-95e1-4666-8213-5d13e99d782b updated
Jan 23 09:40:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:34.269 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 209c613a-95e1-4666-8213-5d13e99d782b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:34 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:34.270 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[05bbc297-7cf9-4e83-abc1-35eaf8257a34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.946 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.947 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.947 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.947 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.947 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.955 182096 INFO nova.compute.manager [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Terminating instance
Jan 23 09:40:35 compute-0 nova_compute[182092]: 2026-01-23 09:40:35.970 182096 DEBUG nova.compute.manager [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:40:35 compute-0 kernel: tapdd650e27-b8 (unregistering): left promiscuous mode
Jan 23 09:40:36 compute-0 NetworkManager[54920]: <info>  [1769161236.0018] device (tapdd650e27-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:40:36 compute-0 ovn_controller[94697]: 2026-01-23T09:40:36Z|00643|binding|INFO|Releasing lport dd650e27-b8d1-4200-a549-2495e4e181fa from this chassis (sb_readonly=0)
Jan 23 09:40:36 compute-0 ovn_controller[94697]: 2026-01-23T09:40:36Z|00644|binding|INFO|Setting lport dd650e27-b8d1-4200-a549-2495e4e181fa down in Southbound
Jan 23 09:40:36 compute-0 ovn_controller[94697]: 2026-01-23T09:40:36Z|00645|binding|INFO|Removing iface tapdd650e27-b8 ovn-installed in OVS
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.012 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.019 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:8d:19 10.100.0.13'], port_security=['fa:16:3e:b3:8d:19 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c090cdd2-7b44-4f97-9ccb-253fe16eef62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f1cf616-2c32-488c-baab-73aa253de79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5be4fcb5d9df463eb390de7dc8be6fec', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2c46b324-68c0-4a50-a85f-4e212875add8 56176f0a-2fb7-4e45-921d-b2b7e6c271bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae90b9ba-24e1-405b-9cf2-8420d2ca0301, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=dd650e27-b8d1-4200-a549-2495e4e181fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.019 103978 INFO neutron.agent.ovn.metadata.agent [-] Port dd650e27-b8d1-4200-a549-2495e4e181fa in datapath 9f1cf616-2c32-488c-baab-73aa253de79f unbound from our chassis
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.021 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f1cf616-2c32-488c-baab-73aa253de79f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.021 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[37257547-2bcc-4b77-982a-60c6d02b4b3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.022 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f namespace which is not needed anymore
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.027 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 23 09:40:36 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009e.scope: Consumed 11.847s CPU time.
Jan 23 09:40:36 compute-0 systemd-machined[153562]: Machine qemu-79-instance-0000009e terminated.
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [NOTICE]   (230059) : haproxy version is 2.8.14-c23fe91
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [NOTICE]   (230059) : path to executable is /usr/sbin/haproxy
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [WARNING]  (230059) : Exiting Master process...
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [WARNING]  (230059) : Exiting Master process...
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [ALERT]    (230059) : Current worker (230061) exited with code 143 (Terminated)
Jan 23 09:40:36 compute-0 neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f[230055]: [WARNING]  (230059) : All workers exited. Exiting... (0)
Jan 23 09:40:36 compute-0 systemd[1]: libpod-4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744.scope: Deactivated successfully.
Jan 23 09:40:36 compute-0 conmon[230055]: conmon 4489bf894954e9043632 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744.scope/container/memory.events
Jan 23 09:40:36 compute-0 podman[230418]: 2026-01-23 09:40:36.124717379 +0000 UTC m=+0.034727669 container died 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:40:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-92a96dacea434b4d2030329f295a568a2ea4c5daa874076935a1f54718b0b801-merged.mount: Deactivated successfully.
Jan 23 09:40:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744-userdata-shm.mount: Deactivated successfully.
Jan 23 09:40:36 compute-0 podman[230418]: 2026-01-23 09:40:36.144150564 +0000 UTC m=+0.054160855 container cleanup 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 09:40:36 compute-0 systemd[1]: libpod-conmon-4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744.scope: Deactivated successfully.
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.187 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.189 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 podman[230444]: 2026-01-23 09:40:36.194184415 +0000 UTC m=+0.032343883 container remove 4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.198 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[831cd576-c0e8-42f1-9e59-54b6597032a0]: (4, ('Fri Jan 23 09:40:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f (4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744)\n4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744\nFri Jan 23 09:40:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f (4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744)\n4489bf894954e90436324e3f70f7fc1643148e845b3703d2af33b1c8a062c744\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.199 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9550caa3-a582-4a2a-b49d-b8b2dce6f9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.201 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f1cf616-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.202 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 kernel: tap9f1cf616-20: left promiscuous mode
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.217 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.218 182096 INFO nova.virt.libvirt.driver [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Instance destroyed successfully.
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.218 182096 DEBUG nova.objects.instance [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lazy-loading 'resources' on Instance uuid c090cdd2-7b44-4f97-9ccb-253fe16eef62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.222 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[28b79e9f-1972-4d84-ab3d-017c917cfb3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.223 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.229 182096 DEBUG nova.virt.libvirt.vif [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-315406914-access_point-626832385',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-315406914-acc',id=158,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDhSC5lTbToOC4esGJ7L03gtjuSkFHF+O4VOv/0ddgi0bh9OfG3oIyqm1/8NB7Q4aoFUFogxzG/XzWucYO6KnLvyJ2b8377h31gIuMHmN7lo/b662FlKgXZo/fiSt4ZgZQ==',key_name='tempest-TestSecurityGroupsBasicOps-26663372',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:39:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5be4fcb5d9df463eb390de7dc8be6fec',ramdisk_id='',reservation_id='r-fmd9kbtt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-315406914',owner_user_name='tempest-TestSecurityGroupsBasicOps-315406914-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:39:46Z,user_data=None,user_id='ca9dd50592344370a9c82fc29637452b',uuid=c090cdd2-7b44-4f97-9ccb-253fe16eef62,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.230 182096 DEBUG nova.network.os_vif_util [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converting VIF {"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.230 182096 DEBUG nova.network.os_vif_util [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.231 182096 DEBUG os_vif [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.233 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd650e27-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.232 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb6141b-b8e9-4e8d-b5f8-4824ef8e1c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.234 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4a83e731-9f5e-4bda-84e8-fe1257c98918]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.236 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.242 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.245 182096 INFO os_vif [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:8d:19,bridge_name='br-int',has_traffic_filtering=True,id=dd650e27-b8d1-4200-a549-2495e4e181fa,network=Network(9f1cf616-2c32-488c-baab-73aa253de79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd650e27-b8')
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.245 182096 INFO nova.virt.libvirt.driver [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Deleting instance files /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62_del
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.245 182096 INFO nova.virt.libvirt.driver [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Deletion of /var/lib/nova/instances/c090cdd2-7b44-4f97-9ccb-253fe16eef62_del complete
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.251 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebc3dd6-6ca7-4c47-a07e-2df2495722ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469072, 'reachable_time': 28116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230486, 'error': None, 'target': 'ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d9f1cf616\x2d2c32\x2d488c\x2dbaab\x2d73aa253de79f.mount: Deactivated successfully.
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.253 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f1cf616-2c32-488c-baab-73aa253de79f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:40:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:36.253 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbf538b-3f36-4727-a282-e025d1591867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:36 compute-0 podman[230472]: 2026-01-23 09:40:36.282857719 +0000 UTC m=+0.051295921 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.291 182096 INFO nova.compute.manager [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.291 182096 DEBUG oslo.service.loopingcall [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.292 182096 DEBUG nova.compute.manager [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.292 182096 DEBUG nova.network.neutron [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:40:36 compute-0 podman[230470]: 2026-01-23 09:40:36.294340522 +0000 UTC m=+0.061128193 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.450 182096 DEBUG nova.compute.manager [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.450 182096 DEBUG nova.compute.manager [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing instance network info cache due to event network-changed-dd650e27-b8d1-4200-a549-2495e4e181fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.450 182096 DEBUG oslo_concurrency.lockutils [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.450 182096 DEBUG oslo_concurrency.lockutils [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:40:36 compute-0 nova_compute[182092]: 2026-01-23 09:40:36.451 182096 DEBUG nova.network.neutron [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Refreshing network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.040 182096 DEBUG nova.network.neutron [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.049 182096 INFO nova.compute.manager [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Took 0.76 seconds to deallocate network for instance.
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.113 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.114 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.167 182096 DEBUG nova.compute.provider_tree [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.189 182096 DEBUG nova.scheduler.client.report [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.212 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.245 182096 INFO nova.scheduler.client.report [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Deleted allocations for instance c090cdd2-7b44-4f97-9ccb-253fe16eef62
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.293 182096 DEBUG oslo_concurrency.lockutils [None req-2ad51891-0817-4d76-a65f-0bbdc6b8af45 ca9dd50592344370a9c82fc29637452b 5be4fcb5d9df463eb390de7dc8be6fec - - default default] Lock "c090cdd2-7b44-4f97-9ccb-253fe16eef62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.529 182096 DEBUG nova.network.neutron [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updated VIF entry in instance network info cache for port dd650e27-b8d1-4200-a549-2495e4e181fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.530 182096 DEBUG nova.network.neutron [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Updating instance_info_cache with network_info: [{"id": "dd650e27-b8d1-4200-a549-2495e4e181fa", "address": "fa:16:3e:b3:8d:19", "network": {"id": "9f1cf616-2c32-488c-baab-73aa253de79f", "bridge": "br-int", "label": "tempest-network-smoke--168395869", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5be4fcb5d9df463eb390de7dc8be6fec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd650e27-b8", "ovs_interfaceid": "dd650e27-b8d1-4200-a549-2495e4e181fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.531 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:37 compute-0 nova_compute[182092]: 2026-01-23 09:40:37.544 182096 DEBUG oslo_concurrency.lockutils [req-5c02f6cf-9aa5-4798-b73e-c1c1461c9865 req-c77fb2e2-8542-4b5b-a58a-2f5b6254439a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-c090cdd2-7b44-4f97-9ccb-253fe16eef62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:40:38 compute-0 nova_compute[182092]: 2026-01-23 09:40:38.513 182096 DEBUG nova.compute.manager [req-d36e7456-9374-46c4-8aae-1f5dda78c2db req-f093d906-6c42-4258-981e-d2746f547f8f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Received event network-vif-deleted-dd650e27-b8d1-4200-a549-2495e4e181fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:39 compute-0 podman[230513]: 2026-01-23 09:40:39.198703591 +0000 UTC m=+0.037622306 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter)
Jan 23 09:40:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:39.871 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:41 compute-0 nova_compute[182092]: 2026-01-23 09:40:41.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:41 compute-0 nova_compute[182092]: 2026-01-23 09:40:41.703 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:41 compute-0 nova_compute[182092]: 2026-01-23 09:40:41.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:42 compute-0 nova_compute[182092]: 2026-01-23 09:40:42.530 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:46 compute-0 nova_compute[182092]: 2026-01-23 09:40:46.236 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:47 compute-0 nova_compute[182092]: 2026-01-23 09:40:47.532 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:49 compute-0 podman[230533]: 2026-01-23 09:40:49.220278122 +0000 UTC m=+0.056986503 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 09:40:51 compute-0 nova_compute[182092]: 2026-01-23 09:40:51.216 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161236.2154353, c090cdd2-7b44-4f97-9ccb-253fe16eef62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:40:51 compute-0 nova_compute[182092]: 2026-01-23 09:40:51.217 182096 INFO nova.compute.manager [-] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] VM Stopped (Lifecycle Event)
Jan 23 09:40:51 compute-0 nova_compute[182092]: 2026-01-23 09:40:51.230 182096 DEBUG nova.compute.manager [None req-d9b2bbf9-fd1c-4203-ad31-f109a8ee7500 - - - - - -] [instance: c090cdd2-7b44-4f97-9ccb-253fe16eef62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:51 compute-0 nova_compute[182092]: 2026-01-23 09:40:51.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:52 compute-0 nova_compute[182092]: 2026-01-23 09:40:52.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.405 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.405 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.422 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.488 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.489 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.493 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.493 182096 INFO nova.compute.claims [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.603 182096 DEBUG nova.compute.provider_tree [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.615 182096 DEBUG nova.scheduler.client.report [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.629 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.629 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.668 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.668 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.678 182096 INFO nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.695 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.804 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.805 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.806 182096 INFO nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Creating image(s)
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.806 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.806 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.807 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.817 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.861 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.862 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.862 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.871 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.915 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.916 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.935 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.936 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.936 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.948 182096 DEBUG nova.policy [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.978 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.979 182096 DEBUG nova.virt.disk.api [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:40:53 compute-0 nova_compute[182092]: 2026-01-23 09:40:53.979 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.022 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.023 182096 DEBUG nova.virt.disk.api [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.023 182096 DEBUG nova.objects.instance [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.036 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.037 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Ensure instance console log exists: /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.037 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.038 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.038 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:54 compute-0 nova_compute[182092]: 2026-01-23 09:40:54.612 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Successfully created port: 063af96a-932f-4abc-b86a-e046a2f8ba53 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.492 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Successfully updated port: 063af96a-932f-4abc-b86a-e046a2f8ba53 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.505 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.505 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.505 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.590 182096 DEBUG nova.compute.manager [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.590 182096 DEBUG nova.compute.manager [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing instance network info cache due to event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.590 182096 DEBUG oslo_concurrency.lockutils [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:40:55 compute-0 nova_compute[182092]: 2026-01-23 09:40:55.642 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.238 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.614 182096 DEBUG nova.network.neutron [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.633 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.633 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance network_info: |[{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.633 182096 DEBUG oslo_concurrency.lockutils [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.634 182096 DEBUG nova.network.neutron [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing network info cache for port 063af96a-932f-4abc-b86a-e046a2f8ba53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.636 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Start _get_guest_xml network_info=[{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.640 182096 WARNING nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.645 182096 DEBUG nova.virt.libvirt.host [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.646 182096 DEBUG nova.virt.libvirt.host [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.648 182096 DEBUG nova.virt.libvirt.host [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.649 182096 DEBUG nova.virt.libvirt.host [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.650 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.650 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.650 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.651 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.651 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.651 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.651 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.652 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.652 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.652 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.652 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.653 182096 DEBUG nova.virt.hardware [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.656 182096 DEBUG nova.virt.libvirt.vif [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:40:53Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.656 182096 DEBUG nova.network.os_vif_util [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.657 182096 DEBUG nova.network.os_vif_util [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.657 182096 DEBUG nova.objects.instance [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.665 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <uuid>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</uuid>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <name>instance-000000a2</name>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:40:56</nova:creationTime>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:40:56 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <system>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="serial">d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="uuid">d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </system>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <os>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </os>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <features>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </features>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:c8:1e:7e"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <target dev="tap063af96a-93"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log" append="off"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <video>
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </video>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:40:56 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:40:56 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:40:56 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:40:56 compute-0 nova_compute[182092]: </domain>
Jan 23 09:40:56 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.667 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Preparing to wait for external event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.667 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.667 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.668 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.668 182096 DEBUG nova.virt.libvirt.vif [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:40:53Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.668 182096 DEBUG nova.network.os_vif_util [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.669 182096 DEBUG nova.network.os_vif_util [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.669 182096 DEBUG os_vif [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.670 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.670 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.674 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.675 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap063af96a-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.675 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap063af96a-93, col_values=(('external_ids', {'iface-id': '063af96a-932f-4abc-b86a-e046a2f8ba53', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:1e:7e', 'vm-uuid': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.676 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:56 compute-0 NetworkManager[54920]: <info>  [1769161256.6772] manager: (tap063af96a-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.679 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.681 182096 INFO os_vif [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93')
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.714 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.714 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.714 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:c8:1e:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:40:56 compute-0 nova_compute[182092]: 2026-01-23 09:40:56.715 182096 INFO nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Using config drive
Jan 23 09:40:56 compute-0 podman[230576]: 2026-01-23 09:40:56.751202717 +0000 UTC m=+0.046212194 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:40:56 compute-0 podman[230575]: 2026-01-23 09:40:56.755769467 +0000 UTC m=+0.053569358 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.055 182096 INFO nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Creating config drive at /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.059 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0q95fkv4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.179 182096 DEBUG oslo_concurrency.processutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0q95fkv4" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.2185] manager: (tap063af96a-93): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Jan 23 09:40:57 compute-0 kernel: tap063af96a-93: entered promiscuous mode
Jan 23 09:40:57 compute-0 ovn_controller[94697]: 2026-01-23T09:40:57Z|00646|binding|INFO|Claiming lport 063af96a-932f-4abc-b86a-e046a2f8ba53 for this chassis.
Jan 23 09:40:57 compute-0 ovn_controller[94697]: 2026-01-23T09:40:57Z|00647|binding|INFO|063af96a-932f-4abc-b86a-e046a2f8ba53: Claiming fa:16:3e:c8:1e:7e 10.100.0.6
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.224 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 systemd-udevd[230629]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.253 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:1e:7e 10.100.0.6'], port_security=['fa:16:3e:c8:1e:7e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d8e2ea8-a5ae-4aa4-917a-02fc18fac44b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6166dd82-cb3d-4477-9fed-bb08c05a2a36, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=063af96a-932f-4abc-b86a-e046a2f8ba53) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.254 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 063af96a-932f-4abc-b86a-e046a2f8ba53 in datapath 17a214aa-cca5-4eb9-92b6-24247cdd6a0d bound to our chassis
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.256 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17a214aa-cca5-4eb9-92b6-24247cdd6a0d
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.2585] device (tap063af96a-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.2591] device (tap063af96a-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:40:57 compute-0 systemd-machined[153562]: New machine qemu-80-instance-000000a2.
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.264 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f776f4a0-dcdd-429d-8cb6-93f1836b5014]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.265 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17a214aa-c1 in ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.267 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17a214aa-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.267 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[05057d04-252c-44ab-abc0-a94a1900f0e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.267 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[71fd039d-ae2e-4e3c-8e77-1823f5b3fadc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.275 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[75b7eb00-181e-43ba-a9be-960614972611]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.279 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-000000a2.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.283 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 ovn_controller[94697]: 2026-01-23T09:40:57Z|00648|binding|INFO|Setting lport 063af96a-932f-4abc-b86a-e046a2f8ba53 ovn-installed in OVS
Jan 23 09:40:57 compute-0 ovn_controller[94697]: 2026-01-23T09:40:57Z|00649|binding|INFO|Setting lport 063af96a-932f-4abc-b86a-e046a2f8ba53 up in Southbound
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.285 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.286 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[90261019-310a-4c8f-beef-3dc53db9f94b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.312 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[d658a0fd-ac78-42bc-9403-4dd2bcec7b5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.3182] manager: (tap17a214aa-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.317 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c5582576-1684-4cdc-b272-688e1f6815af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.343 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8507d173-ae82-4c35-a157-868d8b4a380f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.345 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e39aab2d-70d8-45b2-9437-27a94fe5d7fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.3618] device (tap17a214aa-c0): carrier: link connected
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.367 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b97321f6-f1e7-4bd1-8c45-538717c8d21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.381 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0155df-36f6-4cf8-b283-38d11b3b3769]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17a214aa-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:09:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476284, 'reachable_time': 39344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230655, 'error': None, 'target': 'ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.392 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d2250633-7f4e-4e82-8675-b2f1fb253553]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:9e5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476284, 'tstamp': 476284}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230656, 'error': None, 'target': 'ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.405 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8335fb61-4a9d-4d8e-a18d-133265979030]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17a214aa-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:09:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476284, 'reachable_time': 39344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230657, 'error': None, 'target': 'ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.427 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd5454f-227b-45d7-8acf-2aa4de961373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.470 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c9070924-5279-410f-bb31-da6f13fb8c19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.471 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17a214aa-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.471 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.472 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17a214aa-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.473 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 NetworkManager[54920]: <info>  [1769161257.4739] manager: (tap17a214aa-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 23 09:40:57 compute-0 kernel: tap17a214aa-c0: entered promiscuous mode
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.476 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.476 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17a214aa-c0, col_values=(('external_ids', {'iface-id': '083ece3e-9182-4251-86be-2cee076862d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:40:57 compute-0 ovn_controller[94697]: 2026-01-23T09:40:57Z|00650|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.488 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.489 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17a214aa-cca5-4eb9-92b6-24247cdd6a0d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17a214aa-cca5-4eb9-92b6-24247cdd6a0d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.489 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2a66bf-a0ed-4b45-b092-7c505767e631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.490 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-17a214aa-cca5-4eb9-92b6-24247cdd6a0d
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/17a214aa-cca5-4eb9-92b6-24247cdd6a0d.pid.haproxy
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 17a214aa-cca5-4eb9-92b6-24247cdd6a0d
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:40:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:40:57.490 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'env', 'PROCESS_TAG=haproxy-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17a214aa-cca5-4eb9-92b6-24247cdd6a0d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.702 182096 DEBUG nova.compute.manager [req-6af40dba-a5cc-447f-a1c1-7f8952cb468e req-63590b30-f50f-40f2-a562-7f189384fc6a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.703 182096 DEBUG oslo_concurrency.lockutils [req-6af40dba-a5cc-447f-a1c1-7f8952cb468e req-63590b30-f50f-40f2-a562-7f189384fc6a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.703 182096 DEBUG oslo_concurrency.lockutils [req-6af40dba-a5cc-447f-a1c1-7f8952cb468e req-63590b30-f50f-40f2-a562-7f189384fc6a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.703 182096 DEBUG oslo_concurrency.lockutils [req-6af40dba-a5cc-447f-a1c1-7f8952cb468e req-63590b30-f50f-40f2-a562-7f189384fc6a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.703 182096 DEBUG nova.compute.manager [req-6af40dba-a5cc-447f-a1c1-7f8952cb468e req-63590b30-f50f-40f2-a562-7f189384fc6a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Processing event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.704 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161257.7025046, d2ecc8e0-c714-4bea-a634-4908c7d6cdef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.704 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] VM Started (Lifecycle Event)
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.706 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.709 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.712 182096 INFO nova.virt.libvirt.driver [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance spawned successfully.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.712 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.718 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.720 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.727 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.727 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.727 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.728 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.728 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.728 182096 DEBUG nova.virt.libvirt.driver [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.745 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.746 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161257.702622, d2ecc8e0-c714-4bea-a634-4908c7d6cdef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.746 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] VM Paused (Lifecycle Event)
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.767 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.769 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161257.7103941, d2ecc8e0-c714-4bea-a634-4908c7d6cdef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.769 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] VM Resumed (Lifecycle Event)
Jan 23 09:40:57 compute-0 podman[230692]: 2026-01-23 09:40:57.781525625 +0000 UTC m=+0.033609139 container create 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.785 182096 INFO nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Took 3.98 seconds to spawn the instance on the hypervisor.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.786 182096 DEBUG nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.787 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.793 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:40:57 compute-0 systemd[1]: Started libpod-conmon-275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687.scope.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.811 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:40:57 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:40:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/352cff5850991215301d4c05f63baa3c5a1d3db642ed53f962a0e267baba74be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:40:57 compute-0 podman[230692]: 2026-01-23 09:40:57.836938248 +0000 UTC m=+0.089021752 container init 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:40:57 compute-0 podman[230692]: 2026-01-23 09:40:57.841964707 +0000 UTC m=+0.094048220 container start 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 09:40:57 compute-0 podman[230692]: 2026-01-23 09:40:57.765597268 +0000 UTC m=+0.017680802 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.848 182096 INFO nova.compute.manager [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Took 4.38 seconds to build instance.
Jan 23 09:40:57 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [NOTICE]   (230708) : New worker (230710) forked
Jan 23 09:40:57 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [NOTICE]   (230708) : Loading success.
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.860 182096 DEBUG oslo_concurrency.lockutils [None req-9565261d-a953-4ebd-9a60-b2554ff1eb4b 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.895 182096 DEBUG nova.network.neutron [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated VIF entry in instance network info cache for port 063af96a-932f-4abc-b86a-e046a2f8ba53. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.896 182096 DEBUG nova.network.neutron [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:40:57 compute-0 nova_compute[182092]: 2026-01-23 09:40:57.907 182096 DEBUG oslo_concurrency.lockutils [req-74b1ec66-e9ba-491f-8b30-c88370376f06 req-91c9875c-5d58-405d-a78c-bade9d1f5700 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.804 182096 DEBUG nova.compute.manager [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.805 182096 DEBUG oslo_concurrency.lockutils [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.805 182096 DEBUG oslo_concurrency.lockutils [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.805 182096 DEBUG oslo_concurrency.lockutils [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.806 182096 DEBUG nova.compute.manager [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.806 182096 WARNING nova.compute.manager [req-1d6fe3de-419c-445c-9dc9-67df3fd8716b req-a6197361-d8cd-4c5f-b906-ba2d46b48ee9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 for instance with vm_state active and task_state None.
Jan 23 09:40:59 compute-0 NetworkManager[54920]: <info>  [1769161259.9392] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 23 09:40:59 compute-0 NetworkManager[54920]: <info>  [1769161259.9397] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 23 09:40:59 compute-0 nova_compute[182092]: 2026-01-23 09:40:59.940 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.112 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:00 compute-0 ovn_controller[94697]: 2026-01-23T09:41:00Z|00651|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.129 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.176 182096 DEBUG nova.compute.manager [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.177 182096 DEBUG nova.compute.manager [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing instance network info cache due to event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.177 182096 DEBUG oslo_concurrency.lockutils [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.177 182096 DEBUG oslo_concurrency.lockutils [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:00 compute-0 nova_compute[182092]: 2026-01-23 09:41:00.177 182096 DEBUG nova.network.neutron [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing network info cache for port 063af96a-932f-4abc-b86a-e046a2f8ba53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:41:01 compute-0 nova_compute[182092]: 2026-01-23 09:41:01.036 182096 DEBUG nova.network.neutron [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated VIF entry in instance network info cache for port 063af96a-932f-4abc-b86a-e046a2f8ba53. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:41:01 compute-0 nova_compute[182092]: 2026-01-23 09:41:01.037 182096 DEBUG nova.network.neutron [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:01 compute-0 nova_compute[182092]: 2026-01-23 09:41:01.050 182096 DEBUG oslo_concurrency.lockutils [req-ec94bada-fb78-46e0-ae6f-13e280386a07 req-c725fd9b-68c4-4f28-8f4b-09b119b01730 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:01 compute-0 nova_compute[182092]: 2026-01-23 09:41:01.678 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:02 compute-0 nova_compute[182092]: 2026-01-23 09:41:02.536 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:04 compute-0 nova_compute[182092]: 2026-01-23 09:41:04.281 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:04.281 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:04 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:04.282 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:41:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:05.284 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:06 compute-0 nova_compute[182092]: 2026-01-23 09:41:06.680 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:07 compute-0 podman[230717]: 2026-01-23 09:41:07.213261802 +0000 UTC m=+0.047334240 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:41:07 compute-0 podman[230716]: 2026-01-23 09:41:07.214314648 +0000 UTC m=+0.049575798 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:41:07 compute-0 nova_compute[182092]: 2026-01-23 09:41:07.537 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:07 compute-0 ovn_controller[94697]: 2026-01-23T09:41:07Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:1e:7e 10.100.0.6
Jan 23 09:41:07 compute-0 ovn_controller[94697]: 2026-01-23T09:41:07Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:1e:7e 10.100.0.6
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.092 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.092 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.109 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.189 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.190 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.197 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.198 182096 INFO nova.compute.claims [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.329 182096 DEBUG nova.compute.provider_tree [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.344 182096 DEBUG nova.scheduler.client.report [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.358 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.358 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.399 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.399 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.412 182096 INFO nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.424 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.525 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.526 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.526 182096 INFO nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Creating image(s)
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.527 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.527 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.528 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.538 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.597 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.598 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.599 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.608 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.666 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.667 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.689 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.690 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.690 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.747 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.748 182096 DEBUG nova.virt.disk.api [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Checking if we can resize image /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.749 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.806 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.807 182096 DEBUG nova.virt.disk.api [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Cannot resize image /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.807 182096 DEBUG nova.objects.instance [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 66b8d4e6-8a07-42e5-b981-86a1226fffd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.816 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.817 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Ensure instance console log exists: /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.817 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.818 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.818 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:08 compute-0 nova_compute[182092]: 2026-01-23 09:41:08.925 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Successfully created port: a63aa922-fe9f-43fc-b153-7e0ace515ca1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:41:10 compute-0 podman[230782]: 2026-01-23 09:41:10.207497146 +0000 UTC m=+0.044209689 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, release=1755695350)
Jan 23 09:41:10 compute-0 nova_compute[182092]: 2026-01-23 09:41:10.988 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Successfully updated port: a63aa922-fe9f-43fc-b153-7e0ace515ca1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:41:11 compute-0 nova_compute[182092]: 2026-01-23 09:41:11.000 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:11 compute-0 nova_compute[182092]: 2026-01-23 09:41:11.000 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquired lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:11 compute-0 nova_compute[182092]: 2026-01-23 09:41:11.000 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:41:11 compute-0 nova_compute[182092]: 2026-01-23 09:41:11.675 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:41:11 compute-0 nova_compute[182092]: 2026-01-23 09:41:11.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.374 182096 DEBUG nova.network.neutron [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Updating instance_info_cache with network_info: [{"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.388 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Releasing lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.388 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Instance network_info: |[{"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.390 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Start _get_guest_xml network_info=[{"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.394 182096 WARNING nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.398 182096 DEBUG nova.virt.libvirt.host [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.398 182096 DEBUG nova.virt.libvirt.host [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.401 182096 DEBUG nova.virt.libvirt.host [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.401 182096 DEBUG nova.virt.libvirt.host [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.403 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.403 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.403 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.404 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.405 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.405 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.405 182096 DEBUG nova.virt.hardware [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.408 182096 DEBUG nova.virt.libvirt.vif [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1666454447',display_name='tempest-TestServerMultinode-server-1666454447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1666454447',id=165,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8e68861a50e9454ab7a4f1b437b859d7',ramdisk_id='',reservation_id='r-l9qw3alu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1114040142',owner_user_name='tempest-TestServerMultinode-1114040142-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:08Z,user_data=None,user_id='7cdda3fda14844ceb053212cbacca2c0',uuid=66b8d4e6-8a07-42e5-b981-86a1226fffd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.408 182096 DEBUG nova.network.os_vif_util [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converting VIF {"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.409 182096 DEBUG nova.network.os_vif_util [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.409 182096 DEBUG nova.objects.instance [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 66b8d4e6-8a07-42e5-b981-86a1226fffd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.423 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <uuid>66b8d4e6-8a07-42e5-b981-86a1226fffd0</uuid>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <name>instance-000000a5</name>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:name>tempest-TestServerMultinode-server-1666454447</nova:name>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:41:12</nova:creationTime>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:user uuid="7cdda3fda14844ceb053212cbacca2c0">tempest-TestServerMultinode-1114040142-project-admin</nova:user>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:project uuid="8e68861a50e9454ab7a4f1b437b859d7">tempest-TestServerMultinode-1114040142</nova:project>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         <nova:port uuid="a63aa922-fe9f-43fc-b153-7e0ace515ca1">
Jan 23 09:41:12 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <system>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="serial">66b8d4e6-8a07-42e5-b981-86a1226fffd0</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="uuid">66b8d4e6-8a07-42e5-b981-86a1226fffd0</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </system>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <os>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </os>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <features>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </features>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.config"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:31:61:f5"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <target dev="tapa63aa922-fe"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/console.log" append="off"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <video>
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </video>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:41:12 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:41:12 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:41:12 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:41:12 compute-0 nova_compute[182092]: </domain>
Jan 23 09:41:12 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.423 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Preparing to wait for external event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.423 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.424 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.424 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.424 182096 DEBUG nova.virt.libvirt.vif [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1666454447',display_name='tempest-TestServerMultinode-server-1666454447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1666454447',id=165,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8e68861a50e9454ab7a4f1b437b859d7',ramdisk_id='',reservation_id='r-l9qw3alu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1114040142',owner_user_name='tempest-TestServerMultinode-1114040142-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:08Z,user_data=None,user_id='7cdda3fda14844ceb053212cbacca2c0',uuid=66b8d4e6-8a07-42e5-b981-86a1226fffd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.424 182096 DEBUG nova.network.os_vif_util [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converting VIF {"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.425 182096 DEBUG nova.network.os_vif_util [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.425 182096 DEBUG os_vif [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.425 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.426 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.426 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.428 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.428 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa63aa922-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.428 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa63aa922-fe, col_values=(('external_ids', {'iface-id': 'a63aa922-fe9f-43fc-b153-7e0ace515ca1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:61:f5', 'vm-uuid': '66b8d4e6-8a07-42e5-b981-86a1226fffd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.429 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:12 compute-0 NetworkManager[54920]: <info>  [1769161272.4307] manager: (tapa63aa922-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.433 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.437 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.438 182096 INFO os_vif [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe')
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.478 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.478 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.479 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] No VIF found with MAC fa:16:3e:31:61:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.479 182096 INFO nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Using config drive
Jan 23 09:41:12 compute-0 nova_compute[182092]: 2026-01-23 09:41:12.538 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.055 182096 INFO nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Creating config drive at /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.config
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.060 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqydotsi4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.109 182096 DEBUG nova.compute.manager [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-changed-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.109 182096 DEBUG nova.compute.manager [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Refreshing instance network info cache due to event network-changed-a63aa922-fe9f-43fc-b153-7e0ace515ca1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.109 182096 DEBUG oslo_concurrency.lockutils [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.110 182096 DEBUG oslo_concurrency.lockutils [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.110 182096 DEBUG nova.network.neutron [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Refreshing network info cache for port a63aa922-fe9f-43fc-b153-7e0ace515ca1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.179 182096 DEBUG oslo_concurrency.processutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqydotsi4" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.2244] manager: (tapa63aa922-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Jan 23 09:41:13 compute-0 kernel: tapa63aa922-fe: entered promiscuous mode
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.228 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 ovn_controller[94697]: 2026-01-23T09:41:13Z|00652|binding|INFO|Claiming lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 for this chassis.
Jan 23 09:41:13 compute-0 ovn_controller[94697]: 2026-01-23T09:41:13Z|00653|binding|INFO|a63aa922-fe9f-43fc-b153-7e0ace515ca1: Claiming fa:16:3e:31:61:f5 10.100.0.10
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.234 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:61:f5 10.100.0.10'], port_security=['fa:16:3e:31:61:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66b8d4e6-8a07-42e5-b981-86a1226fffd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8e68861a50e9454ab7a4f1b437b859d7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aec0ca2e-7632-48c8-bb43-22a067f06685', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4dc180f-5441-47d2-bd05-18a2a07d1d54, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a63aa922-fe9f-43fc-b153-7e0ace515ca1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.235 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a63aa922-fe9f-43fc-b153-7e0ace515ca1 in datapath 970f64b7-d25f-4e3e-b0fc-94005bc75845 bound to our chassis
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.236 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 970f64b7-d25f-4e3e-b0fc-94005bc75845
Jan 23 09:41:13 compute-0 ovn_controller[94697]: 2026-01-23T09:41:13Z|00654|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 ovn-installed in OVS
Jan 23 09:41:13 compute-0 ovn_controller[94697]: 2026-01-23T09:41:13Z|00655|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 up in Southbound
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.243 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.246 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.249 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7f7b86-ad5d-48d4-87b5-1cac5f7c7976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.250 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap970f64b7-d1 in ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.252 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap970f64b7-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.252 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c92e0979-e399-4d98-a411-7eb7eeeaa5c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.253 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b40cf27-122a-445a-b12f-7c35288e7e86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 systemd-udevd[230821]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.260 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[9d12028b-3faa-4c7f-b706-8df6e5f7e280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.2729] device (tapa63aa922-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.2737] device (tapa63aa922-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:41:13 compute-0 systemd-machined[153562]: New machine qemu-81-instance-000000a5.
Jan 23 09:41:13 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-000000a5.
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.281 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[771ee8b4-5414-4bc4-aa47-a55df222f563]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.306 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f6b126-4eba-423e-84d4-74d32a883128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.3122] manager: (tap970f64b7-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.311 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb61c1f-55a8-428b-896e-54ec7d46d02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.336 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[96e4a141-3419-418f-9d17-0394b27b8cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.338 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[fa83c22b-c74c-4626-9bd6-df81dfd772ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.3533] device (tap970f64b7-d0): carrier: link connected
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.358 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1f81d4da-221e-4c25-95e5-87ecd795bd61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.371 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[60da8b3a-6769-40b4-a6e3-445ccb94de36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap970f64b7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:b0:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477883, 'reachable_time': 16892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230845, 'error': None, 'target': 'ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.381 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1e48c63a-467a-46d0-bbc8-5186d865900a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:b07f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477883, 'tstamp': 477883}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230846, 'error': None, 'target': 'ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.393 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c02e73d-60b7-41a7-93dc-fa9112d94acf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap970f64b7-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:b0:7f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477883, 'reachable_time': 16892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230847, 'error': None, 'target': 'ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.413 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e7013c-e31b-40eb-822a-c353e6b23c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.455 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2f289fe8-c127-43db-930b-177e4983739c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.456 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap970f64b7-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.457 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.457 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap970f64b7-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:13 compute-0 NetworkManager[54920]: <info>  [1769161273.4593] manager: (tap970f64b7-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 23 09:41:13 compute-0 kernel: tap970f64b7-d0: entered promiscuous mode
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.458 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.463 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap970f64b7-d0, col_values=(('external_ids', {'iface-id': 'b0a29239-5659-4ff4-b8aa-51a199013b15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.463 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 ovn_controller[94697]: 2026-01-23T09:41:13Z|00656|binding|INFO|Releasing lport b0a29239-5659-4ff4-b8aa-51a199013b15 from this chassis (sb_readonly=0)
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.467 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/970f64b7-d25f-4e3e-b0fc-94005bc75845.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/970f64b7-d25f-4e3e-b0fc-94005bc75845.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.467 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1b7bd6-213b-485b-a1df-715d7e04071b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.468 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-970f64b7-d25f-4e3e-b0fc-94005bc75845
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/970f64b7-d25f-4e3e-b0fc-94005bc75845.pid.haproxy
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 970f64b7-d25f-4e3e-b0fc-94005bc75845
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:41:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:13.470 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'env', 'PROCESS_TAG=haproxy-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/970f64b7-d25f-4e3e-b0fc-94005bc75845.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.477 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:13 compute-0 nova_compute[182092]: 2026-01-23 09:41:13.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:13 compute-0 podman[230876]: 2026-01-23 09:41:13.748961494 +0000 UTC m=+0.032487423 container create ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:41:13 compute-0 systemd[1]: Started libpod-conmon-ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c.scope.
Jan 23 09:41:13 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:41:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e93b555fa5ae2adcbdf79f1a8d3f1ed6e43093e6a935e2ec2ee2e564066a5e68/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:41:13 compute-0 podman[230876]: 2026-01-23 09:41:13.821053178 +0000 UTC m=+0.104579118 container init ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:41:13 compute-0 podman[230876]: 2026-01-23 09:41:13.825887285 +0000 UTC m=+0.109413215 container start ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:41:13 compute-0 podman[230876]: 2026-01-23 09:41:13.734063309 +0000 UTC m=+0.017589250 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:41:13 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [NOTICE]   (230892) : New worker (230894) forked
Jan 23 09:41:13 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [NOTICE]   (230892) : Loading success.
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.680 182096 INFO nova.compute.manager [None req-93280f03-1c16-42c8-8b6b-909e6d3f135f 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Get console output
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.684 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.698 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161274.6978638, 66b8d4e6-8a07-42e5-b981-86a1226fffd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.698 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] VM Started (Lifecycle Event)
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.710 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.712 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161274.6984725, 66b8d4e6-8a07-42e5-b981-86a1226fffd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.712 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] VM Paused (Lifecycle Event)
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.728 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.730 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.731 182096 DEBUG nova.network.neutron [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Updated VIF entry in instance network info cache for port a63aa922-fe9f-43fc-b153-7e0ace515ca1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.732 182096 DEBUG nova.network.neutron [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Updating instance_info_cache with network_info: [{"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.756 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:41:14 compute-0 nova_compute[182092]: 2026-01-23 09:41:14.758 182096 DEBUG oslo_concurrency.lockutils [req-6d46ce41-0094-413c-8a3e-f58b677b6285 req-37663bcb-dd94-4192-a015-b12e5a1c3d93 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-66b8d4e6-8a07-42e5-b981-86a1226fffd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.184 182096 DEBUG nova.compute.manager [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.184 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.185 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.185 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.185 182096 DEBUG nova.compute.manager [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Processing event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.186 182096 DEBUG nova.compute.manager [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.186 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.186 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.186 182096 DEBUG oslo_concurrency.lockutils [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.187 182096 DEBUG nova.compute.manager [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] No waiting events found dispatching network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.187 182096 WARNING nova.compute.manager [req-c09518e8-cd1d-4032-88e7-86be0262c226 req-e741b952-6dbe-4fb6-885f-12d58f7784a8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received unexpected event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 for instance with vm_state building and task_state spawning.
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.187 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.190 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161275.1900792, 66b8d4e6-8a07-42e5-b981-86a1226fffd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.190 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] VM Resumed (Lifecycle Event)
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.193 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.195 182096 INFO nova.virt.libvirt.driver [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Instance spawned successfully.
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.195 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.206 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.210 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.212 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.212 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.213 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.213 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.213 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.214 182096 DEBUG nova.virt.libvirt.driver [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.230 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.262 182096 INFO nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Took 6.74 seconds to spawn the instance on the hypervisor.
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.262 182096 DEBUG nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.320 182096 INFO nova.compute.manager [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Took 7.17 seconds to build instance.
Jan 23 09:41:15 compute-0 nova_compute[182092]: 2026-01-23 09:41:15.336 182096 DEBUG oslo_concurrency.lockutils [None req-7680f6e0-69a8-41f0-aa41-814dcbbf0550 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:16 compute-0 nova_compute[182092]: 2026-01-23 09:41:16.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:16 compute-0 nova_compute[182092]: 2026-01-23 09:41:16.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.015 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.015 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.016 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.016 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.016 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.023 182096 INFO nova.compute.manager [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Terminating instance
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.029 182096 DEBUG nova.compute.manager [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:41:17 compute-0 kernel: tapa63aa922-fe (unregistering): left promiscuous mode
Jan 23 09:41:17 compute-0 NetworkManager[54920]: <info>  [1769161277.0494] device (tapa63aa922-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00657|binding|INFO|Releasing lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 from this chassis (sb_readonly=0)
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00658|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 down in Southbound
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00659|binding|INFO|Removing iface tapa63aa922-fe ovn-installed in OVS
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.056 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.062 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:61:f5 10.100.0.10'], port_security=['fa:16:3e:31:61:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66b8d4e6-8a07-42e5-b981-86a1226fffd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8e68861a50e9454ab7a4f1b437b859d7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aec0ca2e-7632-48c8-bb43-22a067f06685', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4dc180f-5441-47d2-bd05-18a2a07d1d54, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a63aa922-fe9f-43fc-b153-7e0ace515ca1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.064 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a63aa922-fe9f-43fc-b153-7e0ace515ca1 in datapath 970f64b7-d25f-4e3e-b0fc-94005bc75845 unbound from our chassis
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.066 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 970f64b7-d25f-4e3e-b0fc-94005bc75845, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.069 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.067 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a97e40db-4bda-46cb-a354-c4e3793e08f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.068 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845 namespace which is not needed anymore
Jan 23 09:41:17 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 23 09:41:17 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Consumed 3.221s CPU time.
Jan 23 09:41:17 compute-0 systemd-machined[153562]: Machine qemu-81-instance-000000a5 terminated.
Jan 23 09:41:17 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [NOTICE]   (230892) : haproxy version is 2.8.14-c23fe91
Jan 23 09:41:17 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [NOTICE]   (230892) : path to executable is /usr/sbin/haproxy
Jan 23 09:41:17 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [WARNING]  (230892) : Exiting Master process...
Jan 23 09:41:17 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [ALERT]    (230892) : Current worker (230894) exited with code 143 (Terminated)
Jan 23 09:41:17 compute-0 neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845[230888]: [WARNING]  (230892) : All workers exited. Exiting... (0)
Jan 23 09:41:17 compute-0 systemd[1]: libpod-ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c.scope: Deactivated successfully.
Jan 23 09:41:17 compute-0 podman[230926]: 2026-01-23 09:41:17.166108423 +0000 UTC m=+0.033601153 container died ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:41:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c-userdata-shm.mount: Deactivated successfully.
Jan 23 09:41:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-e93b555fa5ae2adcbdf79f1a8d3f1ed6e43093e6a935e2ec2ee2e564066a5e68-merged.mount: Deactivated successfully.
Jan 23 09:41:17 compute-0 podman[230926]: 2026-01-23 09:41:17.189889937 +0000 UTC m=+0.057382668 container cleanup ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:41:17 compute-0 systemd[1]: libpod-conmon-ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c.scope: Deactivated successfully.
Jan 23 09:41:17 compute-0 kernel: tapa63aa922-fe: entered promiscuous mode
Jan 23 09:41:17 compute-0 kernel: tapa63aa922-fe (unregistering): left promiscuous mode
Jan 23 09:41:17 compute-0 NetworkManager[54920]: <info>  [1769161277.2427] manager: (tapa63aa922-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00660|binding|INFO|Claiming lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 for this chassis.
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00661|binding|INFO|a63aa922-fe9f-43fc-b153-7e0ace515ca1: Claiming fa:16:3e:31:61:f5 10.100.0.10
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.245 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.253 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:61:f5 10.100.0.10'], port_security=['fa:16:3e:31:61:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66b8d4e6-8a07-42e5-b981-86a1226fffd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8e68861a50e9454ab7a4f1b437b859d7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aec0ca2e-7632-48c8-bb43-22a067f06685', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4dc180f-5441-47d2-bd05-18a2a07d1d54, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a63aa922-fe9f-43fc-b153-7e0ace515ca1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00662|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 ovn-installed in OVS
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00663|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 up in Southbound
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00664|binding|INFO|Releasing lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 from this chassis (sb_readonly=1)
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00665|if_status|INFO|Dropped 1 log messages in last 362 seconds (most recently, 362 seconds ago) due to excessive rate
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00666|if_status|INFO|Not setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 down as sb is readonly
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00667|binding|INFO|Releasing lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 from this chassis (sb_readonly=0)
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00668|binding|INFO|Removing iface tapa63aa922-fe ovn-installed in OVS
Jan 23 09:41:17 compute-0 ovn_controller[94697]: 2026-01-23T09:41:17Z|00669|binding|INFO|Setting lport a63aa922-fe9f-43fc-b153-7e0ace515ca1 down in Southbound
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.270 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 podman[230952]: 2026-01-23 09:41:17.276154459 +0000 UTC m=+0.071891959 container remove ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.280 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[760fdc52-9f5e-4dad-bf8a-1ba1b968257b]: (4, ('Fri Jan 23 09:41:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845 (ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c)\nee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c\nFri Jan 23 09:41:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845 (ee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c)\nee72b883d66d338dd24e55a095ab9cf672d6a7bf432a5420234fb52f87854f6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.281 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[76545892-74df-4007-b3d2-e284728aeb92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.282 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap970f64b7-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.284 182096 INFO nova.virt.libvirt.driver [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Instance destroyed successfully.
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.283 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:61:f5 10.100.0.10'], port_security=['fa:16:3e:31:61:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66b8d4e6-8a07-42e5-b981-86a1226fffd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8e68861a50e9454ab7a4f1b437b859d7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aec0ca2e-7632-48c8-bb43-22a067f06685', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4dc180f-5441-47d2-bd05-18a2a07d1d54, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=a63aa922-fe9f-43fc-b153-7e0ace515ca1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:17 compute-0 kernel: tap970f64b7-d0: left promiscuous mode
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.286 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.287 182096 DEBUG nova.objects.instance [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lazy-loading 'resources' on Instance uuid 66b8d4e6-8a07-42e5-b981-86a1226fffd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.289 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.290 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6d21f075-40d0-4d2b-80c1-5db1431e1ae9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.303 182096 DEBUG nova.virt.libvirt.vif [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1666454447',display_name='tempest-TestServerMultinode-server-1666454447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1666454447',id=165,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8e68861a50e9454ab7a4f1b437b859d7',ramdisk_id='',reservation_id='r-l9qw3alu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1114040142',owner_user_name='tempest-TestServerMultinode-1114040142-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:15Z,user_data=None,user_id='7cdda3fda14844ceb053212cbacca2c0',uuid=66b8d4e6-8a07-42e5-b981-86a1226fffd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.303 182096 DEBUG nova.network.os_vif_util [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converting VIF {"id": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "address": "fa:16:3e:31:61:f5", "network": {"id": "970f64b7-d25f-4e3e-b0fc-94005bc75845", "bridge": "br-int", "label": "tempest-TestServerMultinode-1193567515-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "019b8be0eeda4da8bb32421915e47e2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63aa922-fe", "ovs_interfaceid": "a63aa922-fe9f-43fc-b153-7e0ace515ca1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.304 182096 DEBUG nova.network.os_vif_util [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.304 182096 DEBUG os_vif [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.305 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.305 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa63aa922-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.305 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b24e422-d363-47c1-b481-cfb8a1d5776a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.306 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.308 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.306 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[65fc88c4-14ec-483f-a9ea-1ad9f1da9092]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.310 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.312 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.313 182096 INFO os_vif [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:61:f5,bridge_name='br-int',has_traffic_filtering=True,id=a63aa922-fe9f-43fc-b153-7e0ace515ca1,network=Network(970f64b7-d25f-4e3e-b0fc-94005bc75845),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63aa922-fe')
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.313 182096 INFO nova.virt.libvirt.driver [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Deleting instance files /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0_del
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.314 182096 INFO nova.virt.libvirt.driver [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Deletion of /var/lib/nova/instances/66b8d4e6-8a07-42e5-b981-86a1226fffd0_del complete
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.317 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebc5d30-076b-4037-98c0-4589bc768909]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477878, 'reachable_time': 40234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230982, 'error': None, 'target': 'ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d970f64b7\x2dd25f\x2d4e3e\x2db0fc\x2d94005bc75845.mount: Deactivated successfully.
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.320 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-970f64b7-d25f-4e3e-b0fc-94005bc75845 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.321 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[77c5e38e-776f-49f5-8d30-7c1eb74b9a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.321 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a63aa922-fe9f-43fc-b153-7e0ace515ca1 in datapath 970f64b7-d25f-4e3e-b0fc-94005bc75845 unbound from our chassis
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.323 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 970f64b7-d25f-4e3e-b0fc-94005bc75845, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.323 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee15c909-2acd-48e8-81b8-ca174c115038]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.324 103978 INFO neutron.agent.ovn.metadata.agent [-] Port a63aa922-fe9f-43fc-b153-7e0ace515ca1 in datapath 970f64b7-d25f-4e3e-b0fc-94005bc75845 unbound from our chassis
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.325 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 970f64b7-d25f-4e3e-b0fc-94005bc75845, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:41:17 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:17.325 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[68812a1d-295d-486b-a1f7-4e659a01918b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.357 182096 INFO nova.compute.manager [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.358 182096 DEBUG oslo.service.loopingcall [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.358 182096 DEBUG nova.compute.manager [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.358 182096 DEBUG nova.network.neutron [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.539 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.820 182096 DEBUG nova.compute.manager [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-unplugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.820 182096 DEBUG oslo_concurrency.lockutils [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.820 182096 DEBUG oslo_concurrency.lockutils [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.821 182096 DEBUG oslo_concurrency.lockutils [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.821 182096 DEBUG nova.compute.manager [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] No waiting events found dispatching network-vif-unplugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:17 compute-0 nova_compute[182092]: 2026-01-23 09:41:17.821 182096 DEBUG nova.compute.manager [req-e0935660-7171-444f-8d3c-45c7f5da3a8b req-4837de87-407d-4e39-8d0e-53504cd0836f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-unplugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.239 182096 DEBUG nova.network.neutron [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.255 182096 INFO nova.compute.manager [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Took 0.90 seconds to deallocate network for instance.
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.311 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.311 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.363 182096 DEBUG nova.compute.provider_tree [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.381 182096 DEBUG nova.scheduler.client.report [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.396 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.420 182096 INFO nova.scheduler.client.report [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Deleted allocations for instance 66b8d4e6-8a07-42e5-b981-86a1226fffd0
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.478 182096 DEBUG oslo_concurrency.lockutils [None req-b6a7c0d2-598c-4695-8b70-09a3612a0bf9 7cdda3fda14844ceb053212cbacca2c0 8e68861a50e9454ab7a4f1b437b859d7 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.836 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.836 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.837 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.837 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.962 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.963 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:18 compute-0 nova_compute[182092]: 2026-01-23 09:41:18.963 182096 DEBUG nova.objects.instance [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'flavor' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:19 compute-0 nova_compute[182092]: 2026-01-23 09:41:19.960 182096 DEBUG nova.objects.instance [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_requests' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:41:19 compute-0 nova_compute[182092]: 2026-01-23 09:41:19.970 182096 DEBUG nova.network.neutron [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.010 182096 DEBUG nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.010 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.011 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.011 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.011 182096 DEBUG nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] No waiting events found dispatching network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.011 182096 WARNING nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received unexpected event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 for instance with vm_state deleted and task_state None.
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.011 182096 DEBUG nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.012 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.012 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.012 182096 DEBUG oslo_concurrency.lockutils [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "66b8d4e6-8a07-42e5-b981-86a1226fffd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.012 182096 DEBUG nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] No waiting events found dispatching network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.012 182096 WARNING nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received unexpected event network-vif-plugged-a63aa922-fe9f-43fc-b153-7e0ace515ca1 for instance with vm_state deleted and task_state None.
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.013 182096 DEBUG nova.compute.manager [req-947c8ed6-be5d-40e7-9f45-06200b05fff5 req-a16989a5-cde4-4a84-8f7a-cae0d462ba5a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Received event network-vif-deleted-a63aa922-fe9f-43fc-b153-7e0ace515ca1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:20 compute-0 podman[230983]: 2026-01-23 09:41:20.240544511 +0000 UTC m=+0.066467250 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.379 182096 DEBUG nova.policy [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.558 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.572 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.573 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.573 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.573 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.588 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.588 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.589 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.589 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.627 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.679 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.679 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.726 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.739 182096 DEBUG nova.network.neutron [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Successfully created port: 2a029ff6-f6a7-4903-a87d-48339ec92a1b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.929 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.930 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5549MB free_disk=73.18409729003906GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.931 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.931 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.985 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.986 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:41:20 compute-0 nova_compute[182092]: 2026-01-23 09:41:20.986 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:41:21 compute-0 nova_compute[182092]: 2026-01-23 09:41:21.025 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:41:21 compute-0 nova_compute[182092]: 2026-01-23 09:41:21.045 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:41:21 compute-0 nova_compute[182092]: 2026-01-23 09:41:21.070 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:41:21 compute-0 nova_compute[182092]: 2026-01-23 09:41:21.070 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.007 182096 DEBUG nova.network.neutron [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Successfully updated port: 2a029ff6-f6a7-4903-a87d-48339ec92a1b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.038 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.038 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.038 182096 DEBUG nova.network.neutron [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.147 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.307 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:22 compute-0 nova_compute[182092]: 2026-01-23 09:41:22.541 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:23 compute-0 nova_compute[182092]: 2026-01-23 09:41:23.223 182096 DEBUG nova.compute.manager [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-changed-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:23 compute-0 nova_compute[182092]: 2026-01-23 09:41:23.224 182096 DEBUG nova.compute.manager [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing instance network info cache due to event network-changed-2a029ff6-f6a7-4903-a87d-48339ec92a1b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:41:23 compute-0 nova_compute[182092]: 2026-01-23 09:41:23.224 182096 DEBUG oslo_concurrency.lockutils [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.276 182096 DEBUG nova.network.neutron [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.289 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.290 182096 DEBUG oslo_concurrency.lockutils [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.290 182096 DEBUG nova.network.neutron [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing network info cache for port 2a029ff6-f6a7-4903-a87d-48339ec92a1b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.292 182096 DEBUG nova.virt.libvirt.vif [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.293 182096 DEBUG nova.network.os_vif_util [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.293 182096 DEBUG nova.network.os_vif_util [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.294 182096 DEBUG os_vif [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.294 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.294 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.295 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.297 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.297 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a029ff6-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.298 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a029ff6-f6, col_values=(('external_ids', {'iface-id': '2a029ff6-f6a7-4903-a87d-48339ec92a1b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:9f:1c', 'vm-uuid': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.2997] manager: (tap2a029ff6-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.303 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.304 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.305 182096 INFO os_vif [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6')
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.305 182096 DEBUG nova.virt.libvirt.vif [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.305 182096 DEBUG nova.network.os_vif_util [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.306 182096 DEBUG nova.network.os_vif_util [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.307 182096 DEBUG nova.virt.libvirt.guest [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] attach device xml: <interface type="ethernet">
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:ef:9f:1c"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <target dev="tap2a029ff6-f6"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]: </interface>
Jan 23 09:41:24 compute-0 nova_compute[182092]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 09:41:24 compute-0 kernel: tap2a029ff6-f6: entered promiscuous mode
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.3178] manager: (tap2a029ff6-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/331)
Jan 23 09:41:24 compute-0 ovn_controller[94697]: 2026-01-23T09:41:24Z|00670|binding|INFO|Claiming lport 2a029ff6-f6a7-4903-a87d-48339ec92a1b for this chassis.
Jan 23 09:41:24 compute-0 ovn_controller[94697]: 2026-01-23T09:41:24Z|00671|binding|INFO|2a029ff6-f6a7-4903-a87d-48339ec92a1b: Claiming fa:16:3e:ef:9f:1c 10.100.0.18
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.318 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.326 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:9f:1c 10.100.0.18'], port_security=['fa:16:3e:ef:9f:1c 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6385902e-a3fc-45ed-ae47-016bab41e365', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1a496382-70b1-408c-bf2f-e11df7c98661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f76504-0307-4659-beb0-ded71ba0faa8, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a029ff6-f6a7-4903-a87d-48339ec92a1b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.327 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a029ff6-f6a7-4903-a87d-48339ec92a1b in datapath 6385902e-a3fc-45ed-ae47-016bab41e365 bound to our chassis
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.328 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6385902e-a3fc-45ed-ae47-016bab41e365
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.336 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0dcdb698-c2d0-429a-b0bb-427fa2b4a37e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.338 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6385902e-a1 in ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.339 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6385902e-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.340 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[09db6418-32cf-49c5-886e-23b59e3e1a6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.341 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c18c14e7-31d0-43f1-b6fd-b69c21f478b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 systemd-udevd[231021]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.3515] device (tap2a029ff6-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.3519] device (tap2a029ff6-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.352 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd52276-218a-4578-ac7c-d30e2f5c3653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.355 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 ovn_controller[94697]: 2026-01-23T09:41:24Z|00672|binding|INFO|Setting lport 2a029ff6-f6a7-4903-a87d-48339ec92a1b ovn-installed in OVS
Jan 23 09:41:24 compute-0 ovn_controller[94697]: 2026-01-23T09:41:24Z|00673|binding|INFO|Setting lport 2a029ff6-f6a7-4903-a87d-48339ec92a1b up in Southbound
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.363 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.363 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb22321-3b52-4602-a188-126d8ab8390b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.384 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[68938825-5e0e-49f6-a8fb-f93be923b92f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.388 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3c0024-3a97-4cf5-9e61-fd9d098ead61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.3892] manager: (tap6385902e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/332)
Jan 23 09:41:24 compute-0 systemd-udevd[231024]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.396 182096 DEBUG nova.virt.libvirt.driver [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.397 182096 DEBUG nova.virt.libvirt.driver [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.397 182096 DEBUG nova.virt.libvirt.driver [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:c8:1e:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.397 182096 DEBUG nova.virt.libvirt.driver [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:ef:9f:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.412 182096 DEBUG nova.virt.libvirt.guest [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:41:24</nova:creationTime>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:41:24 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     <nova:port uuid="2a029ff6-f6a7-4903-a87d-48339ec92a1b">
Jan 23 09:41:24 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 09:41:24 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:41:24 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:41:24 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:41:24 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.413 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4a1747-615a-453f-8ffd-f6ac354448e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.417 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[dca0b1cb-c76e-4cb0-9c0e-0ae8b5a03bba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.430 182096 DEBUG oslo_concurrency.lockutils [None req-d48dc47b-b6f9-455f-9e48-1a8767b6a117 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.4368] device (tap6385902e-a0): carrier: link connected
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.441 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[cc99a49f-18e3-4902-83e0-fcdef1bf348d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.453 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9f66c4f0-cd74-444e-a45c-3f3ab46a2b7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6385902e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:fc:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478992, 'reachable_time': 39474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231041, 'error': None, 'target': 'ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.464 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d2e112-dc9f-483a-947f-5a5deae98cce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:fcc8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478992, 'tstamp': 478992}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231042, 'error': None, 'target': 'ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.477 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[114ea1f4-d927-4f51-ade6-436464764a0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6385902e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:fc:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 201], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478992, 'reachable_time': 39474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231043, 'error': None, 'target': 'ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.499 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9a9f42-b5ee-4dc1-a5d0-5a4e7b46537d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.538 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4f80a64d-a4e2-42cf-a3cb-3ae1587d50fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.539 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6385902e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.540 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.541 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6385902e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 kernel: tap6385902e-a0: entered promiscuous mode
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 NetworkManager[54920]: <info>  [1769161284.5434] manager: (tap6385902e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.546 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6385902e-a0, col_values=(('external_ids', {'iface-id': '06e98fc4-85e3-4aa7-be1e-a9fdd8dd70e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 ovn_controller[94697]: 2026-01-23T09:41:24Z|00674|binding|INFO|Releasing lport 06e98fc4-85e3-4aa7-be1e-a9fdd8dd70e5 from this chassis (sb_readonly=0)
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.549 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.551 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6385902e-a3fc-45ed-ae47-016bab41e365.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6385902e-a3fc-45ed-ae47-016bab41e365.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.551 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a9036af6-076d-4a4e-936a-de43d40d7261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.552 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-6385902e-a3fc-45ed-ae47-016bab41e365
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/6385902e-a3fc-45ed-ae47-016bab41e365.pid.haproxy
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 6385902e-a3fc-45ed-ae47-016bab41e365
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:41:24 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:24.552 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365', 'env', 'PROCESS_TAG=haproxy-6385902e-a3fc-45ed-ae47-016bab41e365', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6385902e-a3fc-45ed-ae47-016bab41e365.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:41:24 compute-0 nova_compute[182092]: 2026-01-23 09:41:24.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:41:24 compute-0 podman[231071]: 2026-01-23 09:41:24.819731181 +0000 UTC m=+0.030833105 container create 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:41:24 compute-0 systemd[1]: Started libpod-conmon-6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81.scope.
Jan 23 09:41:24 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:41:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/458d357c8482050f163f74c363d895805681d29ffcd09c36021f879265a26298/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:41:24 compute-0 podman[231071]: 2026-01-23 09:41:24.868441125 +0000 UTC m=+0.079543047 container init 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 09:41:24 compute-0 podman[231071]: 2026-01-23 09:41:24.873623217 +0000 UTC m=+0.084725139 container start 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:41:24 compute-0 podman[231071]: 2026-01-23 09:41:24.806685511 +0000 UTC m=+0.017787444 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:41:24 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [NOTICE]   (231087) : New worker (231089) forked
Jan 23 09:41:24 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [NOTICE]   (231087) : Loading success.
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.947 182096 DEBUG nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.947 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 WARNING nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b for instance with vm_state active and task_state None.
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.948 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.949 182096 DEBUG oslo_concurrency.lockutils [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.949 182096 DEBUG nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:41:25 compute-0 nova_compute[182092]: 2026-01-23 09:41:25.949 182096 WARNING nova.compute.manager [req-401effd2-adf1-4847-b99a-f168ef2057d8 req-f4d5b3d0-531e-4102-a5eb-266f626ddb82 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b for instance with vm_state active and task_state None.
Jan 23 09:41:26 compute-0 nova_compute[182092]: 2026-01-23 09:41:26.008 182096 DEBUG nova.network.neutron [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated VIF entry in instance network info cache for port 2a029ff6-f6a7-4903-a87d-48339ec92a1b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:41:26 compute-0 nova_compute[182092]: 2026-01-23 09:41:26.008 182096 DEBUG nova.network.neutron [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:26 compute-0 nova_compute[182092]: 2026-01-23 09:41:26.022 182096 DEBUG oslo_concurrency.lockutils [req-7a4b1beb-5171-4eb4-ba04-a9729d8449be req-39c2cf8c-b190-41ce-b7e8-7f8726531d46 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:26 compute-0 ovn_controller[94697]: 2026-01-23T09:41:26Z|00675|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:41:26 compute-0 ovn_controller[94697]: 2026-01-23T09:41:26Z|00676|binding|INFO|Releasing lport 06e98fc4-85e3-4aa7-be1e-a9fdd8dd70e5 from this chassis (sb_readonly=0)
Jan 23 09:41:26 compute-0 nova_compute[182092]: 2026-01-23 09:41:26.603 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:27 compute-0 ovn_controller[94697]: 2026-01-23T09:41:27Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:9f:1c 10.100.0.18
Jan 23 09:41:27 compute-0 ovn_controller[94697]: 2026-01-23T09:41:27Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:9f:1c 10.100.0.18
Jan 23 09:41:27 compute-0 podman[231094]: 2026-01-23 09:41:27.21642316 +0000 UTC m=+0.049813486 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 23 09:41:27 compute-0 podman[231095]: 2026-01-23 09:41:27.238332635 +0000 UTC m=+0.069453731 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:41:27 compute-0 nova_compute[182092]: 2026-01-23 09:41:27.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:29 compute-0 nova_compute[182092]: 2026-01-23 09:41:29.300 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:29 compute-0 ovn_controller[94697]: 2026-01-23T09:41:29Z|00677|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:41:29 compute-0 ovn_controller[94697]: 2026-01-23T09:41:29Z|00678|binding|INFO|Releasing lport 06e98fc4-85e3-4aa7-be1e-a9fdd8dd70e5 from this chassis (sb_readonly=0)
Jan 23 09:41:29 compute-0 nova_compute[182092]: 2026-01-23 09:41:29.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:32 compute-0 nova_compute[182092]: 2026-01-23 09:41:32.275 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161277.2733035, 66b8d4e6-8a07-42e5-b981-86a1226fffd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:41:32 compute-0 nova_compute[182092]: 2026-01-23 09:41:32.275 182096 INFO nova.compute.manager [-] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] VM Stopped (Lifecycle Event)
Jan 23 09:41:32 compute-0 nova_compute[182092]: 2026-01-23 09:41:32.291 182096 DEBUG nova.compute.manager [None req-c2ddd2e4-94a0-4a78-9995-fb0661c2ab5e - - - - - -] [instance: 66b8d4e6-8a07-42e5-b981-86a1226fffd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:41:32 compute-0 nova_compute[182092]: 2026-01-23 09:41:32.545 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:34 compute-0 nova_compute[182092]: 2026-01-23 09:41:34.304 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:35 compute-0 nova_compute[182092]: 2026-01-23 09:41:35.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:37 compute-0 nova_compute[182092]: 2026-01-23 09:41:37.445 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:37 compute-0 nova_compute[182092]: 2026-01-23 09:41:37.547 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:38 compute-0 podman[231133]: 2026-01-23 09:41:38.206243528 +0000 UTC m=+0.038131657 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:41:38 compute-0 podman[231132]: 2026-01-23 09:41:38.213080461 +0000 UTC m=+0.046607530 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 09:41:39 compute-0 nova_compute[182092]: 2026-01-23 09:41:39.306 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:39.872 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:41:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:39.872 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:41:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:39.873 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:41:40 compute-0 nova_compute[182092]: 2026-01-23 09:41:40.820 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:41 compute-0 podman[231169]: 2026-01-23 09:41:41.210399379 +0000 UTC m=+0.041927776 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container)
Jan 23 09:41:42 compute-0 nova_compute[182092]: 2026-01-23 09:41:42.514 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:42 compute-0 nova_compute[182092]: 2026-01-23 09:41:42.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:42 compute-0 nova_compute[182092]: 2026-01-23 09:41:42.565 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:42.566 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:41:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:42.567 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:41:44 compute-0 nova_compute[182092]: 2026-01-23 09:41:44.308 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:47 compute-0 nova_compute[182092]: 2026-01-23 09:41:47.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:48 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:41:48.569 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:41:49 compute-0 nova_compute[182092]: 2026-01-23 09:41:49.310 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:51 compute-0 podman[231187]: 2026-01-23 09:41:51.21828069 +0000 UTC m=+0.058018738 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.122 182096 DEBUG nova.compute.manager [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-changed-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.122 182096 DEBUG nova.compute.manager [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing instance network info cache due to event network-changed-2a029ff6-f6a7-4903-a87d-48339ec92a1b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.123 182096 DEBUG oslo_concurrency.lockutils [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.123 182096 DEBUG oslo_concurrency.lockutils [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.123 182096 DEBUG nova.network.neutron [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing network info cache for port 2a029ff6-f6a7-4903-a87d-48339ec92a1b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:41:52 compute-0 nova_compute[182092]: 2026-01-23 09:41:52.553 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:53 compute-0 nova_compute[182092]: 2026-01-23 09:41:53.362 182096 DEBUG nova.network.neutron [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated VIF entry in instance network info cache for port 2a029ff6-f6a7-4903-a87d-48339ec92a1b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:41:53 compute-0 nova_compute[182092]: 2026-01-23 09:41:53.363 182096 DEBUG nova.network.neutron [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:41:53 compute-0 nova_compute[182092]: 2026-01-23 09:41:53.387 182096 DEBUG oslo_concurrency.lockutils [req-557353f9-ccd1-4857-bed8-c498a7fb4eb1 req-39174cb7-2080-4449-b462-f473cb975fee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:41:54 compute-0 nova_compute[182092]: 2026-01-23 09:41:54.311 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:55 compute-0 ovn_controller[94697]: 2026-01-23T09:41:55Z|00679|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:41:55 compute-0 ovn_controller[94697]: 2026-01-23T09:41:55Z|00680|binding|INFO|Releasing lport 06e98fc4-85e3-4aa7-be1e-a9fdd8dd70e5 from this chassis (sb_readonly=0)
Jan 23 09:41:55 compute-0 nova_compute[182092]: 2026-01-23 09:41:55.940 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:57 compute-0 nova_compute[182092]: 2026-01-23 09:41:57.554 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:41:58 compute-0 podman[231211]: 2026-01-23 09:41:58.212206531 +0000 UTC m=+0.043162965 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 23 09:41:58 compute-0 podman[231212]: 2026-01-23 09:41:58.212954442 +0000 UTC m=+0.042440032 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:41:59 compute-0 nova_compute[182092]: 2026-01-23 09:41:59.313 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:02 compute-0 nova_compute[182092]: 2026-01-23 09:42:02.556 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:04 compute-0 nova_compute[182092]: 2026-01-23 09:42:04.316 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:05 compute-0 nova_compute[182092]: 2026-01-23 09:42:05.804 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:07 compute-0 nova_compute[182092]: 2026-01-23 09:42:07.558 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:08 compute-0 nova_compute[182092]: 2026-01-23 09:42:08.713 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:09 compute-0 podman[231251]: 2026-01-23 09:42:09.204212632 +0000 UTC m=+0.040258147 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:42:09 compute-0 podman[231250]: 2026-01-23 09:42:09.232191384 +0000 UTC m=+0.071293321 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:42:09 compute-0 nova_compute[182092]: 2026-01-23 09:42:09.317 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:12 compute-0 podman[231287]: 2026-01-23 09:42:12.233809168 +0000 UTC m=+0.066506705 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:42:12 compute-0 nova_compute[182092]: 2026-01-23 09:42:12.560 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:12 compute-0 nova_compute[182092]: 2026-01-23 09:42:12.854 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:14 compute-0 nova_compute[182092]: 2026-01-23 09:42:14.319 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:14.902 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:4d:8d 2001:db8:0:1:f816:3eff:fe0c:4d8d 2001:db8::f816:3eff:fe0c:4d8d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe0c:4d8d/64 2001:db8::f816:3eff:fe0c:4d8d/64', 'neutron:device_id': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71b900ed-163d-4540-b633-6108e740ce75, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c957aec4-5720-4b01-aab9-5f9131b765a9) old=Port_Binding(mac=['fa:16:3e:0c:4d:8d 2001:db8::f816:3eff:fe0c:4d8d'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe0c:4d8d/64', 'neutron:device_id': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:14.903 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c957aec4-5720-4b01-aab9-5f9131b765a9 in datapath 2482367c-d492-48cd-8a65-fd27ef9491ab updated
Jan 23 09:42:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:14.904 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2482367c-d492-48cd-8a65-fd27ef9491ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:42:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:14.905 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8abbcd98-071b-442c-a782-7cbff039e592]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:15 compute-0 nova_compute[182092]: 2026-01-23 09:42:15.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:15 compute-0 nova_compute[182092]: 2026-01-23 09:42:15.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:17 compute-0 nova_compute[182092]: 2026-01-23 09:42:17.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:17 compute-0 nova_compute[182092]: 2026-01-23 09:42:17.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:17 compute-0 nova_compute[182092]: 2026-01-23 09:42:17.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.241 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.241 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.256 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.321 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.326 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.326 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.332 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.333 182096 INFO nova.compute.claims [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.428 182096 DEBUG nova.compute.provider_tree [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.437 182096 DEBUG nova.scheduler.client.report [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.451 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.451 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.505 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.505 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.517 182096 INFO nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.530 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.625 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.626 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.627 182096 INFO nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Creating image(s)
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.627 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.627 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.628 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.638 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.652 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.653 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.653 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.666 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.671 182096 DEBUG nova.policy [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.686 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.687 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.687 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.696 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.743 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.744 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.766 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.767 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.767 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.810 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.811 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.811 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.811 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.815 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.815 182096 DEBUG nova.virt.disk.api [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.816 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.862 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.863 182096 DEBUG nova.virt.disk.api [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.863 182096 DEBUG nova.objects.instance [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid 1d01d877-250f-4069-b7a9-da76e21520a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.876 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.876 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Ensure instance console log exists: /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.877 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.877 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:19 compute-0 nova_compute[182092]: 2026-01-23 09:42:19.878 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:20 compute-0 nova_compute[182092]: 2026-01-23 09:42:20.346 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Successfully created port: 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:42:20 compute-0 nova_compute[182092]: 2026-01-23 09:42:20.796 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Successfully created port: 1a75b58f-843f-4939-99e9-00991ea0a602 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.548 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Successfully updated port: 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.568 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.586 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.587 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.587 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.587 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.602 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.602 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.602 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.603 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.632 182096 DEBUG nova.compute.manager [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.632 182096 DEBUG nova.compute.manager [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing instance network info cache due to event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.633 182096 DEBUG oslo_concurrency.lockutils [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.633 182096 DEBUG oslo_concurrency.lockutils [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.633 182096 DEBUG nova.network.neutron [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing network info cache for port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.648 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.695 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.696 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.744 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.774 182096 DEBUG nova.network.neutron [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.966 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.967 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5515MB free_disk=73.18330764770508GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.967 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:21 compute-0 nova_compute[182092]: 2026-01-23 09:42:21.967 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 1d01d877-250f-4069-b7a9-da76e21520a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.031 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.032 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.076 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.084 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.102 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.102 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:22 compute-0 podman[231326]: 2026-01-23 09:42:22.219312839 +0000 UTC m=+0.056871591 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
tcib_managed=true, container_name=ovn_controller)
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.563 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.681 182096 DEBUG nova.network.neutron [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.691 182096 DEBUG oslo_concurrency.lockutils [req-f321068c-ad81-4dba-b717-5a514b00db19 req-08938caf-f50d-4cca-a412-02b7336b5572 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.828 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Successfully updated port: 1a75b58f-843f-4939-99e9-00991ea0a602 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.842 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.842 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:22 compute-0 nova_compute[182092]: 2026-01-23 09:42:22.842 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:42:23 compute-0 nova_compute[182092]: 2026-01-23 09:42:23.149 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:42:23 compute-0 nova_compute[182092]: 2026-01-23 09:42:23.711 182096 DEBUG nova.compute.manager [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-changed-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:23 compute-0 nova_compute[182092]: 2026-01-23 09:42:23.712 182096 DEBUG nova.compute.manager [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing instance network info cache due to event network-changed-1a75b58f-843f-4939-99e9-00991ea0a602. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:42:23 compute-0 nova_compute[182092]: 2026-01-23 09:42:23.712 182096 DEBUG oslo_concurrency.lockutils [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.164 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.323 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.639 182096 DEBUG nova.network.neutron [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.653 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.653 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance network_info: |[{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.654 182096 DEBUG oslo_concurrency.lockutils [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.654 182096 DEBUG nova.network.neutron [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing network info cache for port 1a75b58f-843f-4939-99e9-00991ea0a602 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.657 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Start _get_guest_xml network_info=[{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.660 182096 WARNING nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.664 182096 DEBUG nova.virt.libvirt.host [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.664 182096 DEBUG nova.virt.libvirt.host [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.668 182096 DEBUG nova.virt.libvirt.host [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.668 182096 DEBUG nova.virt.libvirt.host [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.669 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.669 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.670 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.670 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.670 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.670 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.671 182096 DEBUG nova.virt.hardware [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.674 182096 DEBUG nova.virt.libvirt.vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.674 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.675 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.675 182096 DEBUG nova.virt.libvirt.vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.676 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.676 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.677 182096 DEBUG nova.objects.instance [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d01d877-250f-4069-b7a9-da76e21520a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.686 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <uuid>1d01d877-250f-4069-b7a9-da76e21520a5</uuid>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <name>instance-000000a9</name>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-1786233331</nova:name>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:42:24</nova:creationTime>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:port uuid="8f10cfa2-5bcf-4ed2-b3ba-90adc891327d">
Jan 23 09:42:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         <nova:port uuid="1a75b58f-843f-4939-99e9-00991ea0a602">
Jan 23 09:42:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe60:30e1" ipVersion="6"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe60:30e1" ipVersion="6"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <system>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="serial">1d01d877-250f-4069-b7a9-da76e21520a5</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="uuid">1d01d877-250f-4069-b7a9-da76e21520a5</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </system>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <os>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </os>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <features>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </features>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.config"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:10:7d:6f"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <target dev="tap8f10cfa2-5b"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:60:30:e1"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <target dev="tap1a75b58f-84"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/console.log" append="off"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <video>
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </video>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:42:24 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:42:24 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:42:24 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:42:24 compute-0 nova_compute[182092]: </domain>
Jan 23 09:42:24 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.687 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Preparing to wait for external event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.687 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.687 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.687 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.688 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Preparing to wait for external event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.688 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.688 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.688 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.688 182096 DEBUG nova.virt.libvirt.vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.689 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.689 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.689 182096 DEBUG os_vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.690 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.690 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.691 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.692 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f10cfa2-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.693 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f10cfa2-5b, col_values=(('external_ids', {'iface-id': '8f10cfa2-5bcf-4ed2-b3ba-90adc891327d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:7d:6f', 'vm-uuid': '1d01d877-250f-4069-b7a9-da76e21520a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 NetworkManager[54920]: <info>  [1769161344.6947] manager: (tap8f10cfa2-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.694 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.696 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.699 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.700 182096 INFO os_vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b')
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.700 182096 DEBUG nova.virt.libvirt.vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.701 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.701 182096 DEBUG nova.network.os_vif_util [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.701 182096 DEBUG os_vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.702 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.702 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.702 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.704 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.704 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a75b58f-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.704 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1a75b58f-84, col_values=(('external_ids', {'iface-id': '1a75b58f-843f-4939-99e9-00991ea0a602', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:30:e1', 'vm-uuid': '1d01d877-250f-4069-b7a9-da76e21520a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:24 compute-0 NetworkManager[54920]: <info>  [1769161344.7059] manager: (tap1a75b58f-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.705 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.708 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.710 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.710 182096 INFO os_vif [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84')
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.746 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.746 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.746 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:10:7d:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.746 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:60:30:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:42:24 compute-0 nova_compute[182092]: 2026-01-23 09:42:24.747 182096 INFO nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Using config drive
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.534 182096 INFO nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Creating config drive at /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.config
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.538 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgqe407xg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.657 182096 DEBUG oslo_concurrency.processutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgqe407xg" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.6992] manager: (tap8f10cfa2-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Jan 23 09:42:25 compute-0 kernel: tap8f10cfa2-5b: entered promiscuous mode
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00681|binding|INFO|Claiming lport 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d for this chassis.
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00682|binding|INFO|8f10cfa2-5bcf-4ed2-b3ba-90adc891327d: Claiming fa:16:3e:10:7d:6f 10.100.0.11
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.709 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7122] manager: (tap1a75b58f-84): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.713 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:7d:6f 10.100.0.11'], port_security=['fa:16:3e:10:7d:6f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64bd9a93-c91d-4ee4-acef-a479ec6c08af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a85a5b-705d-4753-aa41-a6530a944e47, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.714 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d in datapath 82773ff7-8c70-4ede-a3ab-2917fd9eda62 bound to our chassis
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.716 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82773ff7-8c70-4ede-a3ab-2917fd9eda62
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00683|binding|INFO|Setting lport 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d ovn-installed in OVS
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00684|binding|INFO|Setting lport 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d up in Southbound
Jan 23 09:42:25 compute-0 kernel: tap1a75b58f-84: entered promiscuous mode
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.725 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00685|if_status|INFO|Dropped 1 log messages in last 195 seconds (most recently, 195 seconds ago) due to excessive rate
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00686|if_status|INFO|Not updating pb chassis for 1a75b58f-843f-4939-99e9-00991ea0a602 now as sb is readonly
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00687|binding|INFO|Claiming lport 1a75b58f-843f-4939-99e9-00991ea0a602 for this chassis.
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00688|binding|INFO|1a75b58f-843f-4939-99e9-00991ea0a602: Claiming fa:16:3e:60:30:e1 2001:db8:0:1:f816:3eff:fe60:30e1 2001:db8::f816:3eff:fe60:30e1
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.727 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f56d791-0d77-4825-8689-17245de478ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.728 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82773ff7-81 in ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.731 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:30:e1 2001:db8:0:1:f816:3eff:fe60:30e1 2001:db8::f816:3eff:fe60:30e1'], port_security=['fa:16:3e:60:30:e1 2001:db8:0:1:f816:3eff:fe60:30e1 2001:db8::f816:3eff:fe60:30e1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe60:30e1/64 2001:db8::f816:3eff:fe60:30e1/64', 'neutron:device_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64bd9a93-c91d-4ee4-acef-a479ec6c08af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71b900ed-163d-4540-b633-6108e740ce75, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1a75b58f-843f-4939-99e9-00991ea0a602) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.738 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82773ff7-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.738 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[676c4dbd-d23a-4442-94db-83629036067b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.739 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3b49c1-4767-4383-96eb-560d8e18c077]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00689|binding|INFO|Setting lport 1a75b58f-843f-4939-99e9-00991ea0a602 ovn-installed in OVS
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00690|binding|INFO|Setting lport 1a75b58f-843f-4939-99e9-00991ea0a602 up in Southbound
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.740 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 systemd-udevd[231373]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:42:25 compute-0 systemd-udevd[231374]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.752 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a9bf02-4a24-45f8-a6ba-eb84d77723f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7549] device (tap1a75b58f-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7556] device (tap1a75b58f-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7606] device (tap8f10cfa2-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7618] device (tap8f10cfa2-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.762 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a385208c-0267-4717-a28e-848e17d93e08]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 systemd-machined[153562]: New machine qemu-82-instance-000000a9.
Jan 23 09:42:25 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-000000a9.
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.789 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[722659c8-7e9c-47d4-a335-287c88c649c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.793 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0d21bf4a-9ba0-4af4-8fbb-de64331b5c77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.7938] manager: (tap82773ff7-80): new Veth device (/org/freedesktop/NetworkManager/Devices/338)
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.820 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[de8ee2d2-8a9b-479d-91a3-0a2d37d900bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.822 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[57c5dd17-ae93-45c8-818d-93e89a33f196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.8392] device (tap82773ff7-80): carrier: link connected
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.843 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe8c816-a820-4a25-856f-855ba95496eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.855 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[90ce935c-be15-497a-92f1-9b6f013e1adf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82773ff7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:62:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485132, 'reachable_time': 25071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231401, 'error': None, 'target': 'ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.867 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0aef51-6254-47b5-8d85-0c9db680345a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:6272'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485132, 'tstamp': 485132}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231402, 'error': None, 'target': 'ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.878 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a93c4dd8-13cf-4210-982d-ddfd81a31753]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82773ff7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:62:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485132, 'reachable_time': 25071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231403, 'error': None, 'target': 'ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.906 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6af7612d-6738-4dd6-9265-b6e6e452f5ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.919 182096 DEBUG nova.compute.manager [req-381968d0-8cb2-4dd1-b202-a66370471496 req-c2e03ba7-7206-456f-a6e8-5eda0f85b015 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.919 182096 DEBUG oslo_concurrency.lockutils [req-381968d0-8cb2-4dd1-b202-a66370471496 req-c2e03ba7-7206-456f-a6e8-5eda0f85b015 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.919 182096 DEBUG oslo_concurrency.lockutils [req-381968d0-8cb2-4dd1-b202-a66370471496 req-c2e03ba7-7206-456f-a6e8-5eda0f85b015 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.920 182096 DEBUG oslo_concurrency.lockutils [req-381968d0-8cb2-4dd1-b202-a66370471496 req-c2e03ba7-7206-456f-a6e8-5eda0f85b015 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.920 182096 DEBUG nova.compute.manager [req-381968d0-8cb2-4dd1-b202-a66370471496 req-c2e03ba7-7206-456f-a6e8-5eda0f85b015 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Processing event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.951 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cfb8d7-f473-43e7-8430-cead63771b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.952 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82773ff7-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.952 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.952 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82773ff7-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:25 compute-0 kernel: tap82773ff7-80: entered promiscuous mode
Jan 23 09:42:25 compute-0 NetworkManager[54920]: <info>  [1769161345.9553] manager: (tap82773ff7-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.957 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.959 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82773ff7-80, col_values=(('external_ids', {'iface-id': 'f270e0ea-abb3-46b6-8ab7-f3b2d93dd703'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:25 compute-0 ovn_controller[94697]: 2026-01-23T09:42:25Z|00691|binding|INFO|Releasing lport f270e0ea-abb3-46b6-8ab7-f3b2d93dd703 from this chassis (sb_readonly=0)
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.960 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.973 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.975 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82773ff7-8c70-4ede-a3ab-2917fd9eda62.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82773ff7-8c70-4ede-a3ab-2917fd9eda62.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.977 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[549e97a9-eeff-4c3e-acdc-f3eaaade48cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.977 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-82773ff7-8c70-4ede-a3ab-2917fd9eda62
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/82773ff7-8c70-4ede-a3ab-2917fd9eda62.pid.haproxy
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 82773ff7-8c70-4ede-a3ab-2917fd9eda62
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:42:25 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:25.978 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'env', 'PROCESS_TAG=haproxy-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82773ff7-8c70-4ede-a3ab-2917fd9eda62.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.983 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161345.9831498, 1d01d877-250f-4069-b7a9-da76e21520a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.983 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] VM Started (Lifecycle Event)
Jan 23 09:42:25 compute-0 nova_compute[182092]: 2026-01-23 09:42:25.999 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.001 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161345.9835927, 1d01d877-250f-4069-b7a9-da76e21520a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.002 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] VM Paused (Lifecycle Event)
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.016 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.018 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.034 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:42:26 compute-0 podman[231439]: 2026-01-23 09:42:26.254117514 +0000 UTC m=+0.030412083 container create 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:42:26 compute-0 systemd[1]: Started libpod-conmon-36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9.scope.
Jan 23 09:42:26 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:42:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dbdcd4fcf4f51378e65222d3f2f39d7d98614b6db343a76f665d04a22dfc106/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:42:26 compute-0 podman[231439]: 2026-01-23 09:42:26.309783299 +0000 UTC m=+0.086077868 container init 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:42:26 compute-0 podman[231439]: 2026-01-23 09:42:26.314341567 +0000 UTC m=+0.090636135 container start 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 09:42:26 compute-0 podman[231439]: 2026-01-23 09:42:26.240267253 +0000 UTC m=+0.016561842 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:42:26 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [NOTICE]   (231455) : New worker (231457) forked
Jan 23 09:42:26 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [NOTICE]   (231455) : Loading success.
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.355 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1a75b58f-843f-4939-99e9-00991ea0a602 in datapath 2482367c-d492-48cd-8a65-fd27ef9491ab unbound from our chassis
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.357 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2482367c-d492-48cd-8a65-fd27ef9491ab
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.365 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[721a9a93-289f-4feb-94e7-bcb06aea96df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.366 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2482367c-d1 in ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.367 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2482367c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.368 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[23aceeb0-b71c-4d4f-8364-abc97bd773f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.369 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[95918339-ba7b-488e-92f8-08a5ec8f9438]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.379 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[6824d456-541d-4ec7-84bb-e021c21b5120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.388 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9f011f-61f2-4b9e-9919-901831b592a1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.409 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ea83df-5dde-42c3-adf8-96ab6fa784fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.413 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddfa22d-62a6-46c3-8a1f-27967c0d7419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 NetworkManager[54920]: <info>  [1769161346.4141] manager: (tap2482367c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/340)
Jan 23 09:42:26 compute-0 systemd-udevd[231395]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.436 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2f384eb2-ee4c-4b23-9136-e5a46bd8b5d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.439 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[261de27d-a1ad-425e-b46a-05d00174f263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 NetworkManager[54920]: <info>  [1769161346.4558] device (tap2482367c-d0): carrier: link connected
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.459 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e8521b24-0ead-476b-8a8a-91281cc5eed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.470 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[27811785-f607-4b5e-a050-0126629b1d1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2482367c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:4d:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485194, 'reachable_time': 17350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231472, 'error': None, 'target': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.482 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4df0a6cd-fa8e-47dc-b67d-a66ef64d93d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:4d8d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485194, 'tstamp': 485194}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231473, 'error': None, 'target': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.492 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[842a82f7-f632-492f-a406-468054b1fb6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2482367c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:4d:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485194, 'reachable_time': 17350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231474, 'error': None, 'target': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.515 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d29102b8-2ec4-4cb6-8d16-809d38584b45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.540 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9454bb-74b7-449a-966c-1f2dca60db4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.541 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2482367c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.541 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.541 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2482367c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:26 compute-0 kernel: tap2482367c-d0: entered promiscuous mode
Jan 23 09:42:26 compute-0 NetworkManager[54920]: <info>  [1769161346.5446] manager: (tap2482367c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.547 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2482367c-d0, col_values=(('external_ids', {'iface-id': 'c957aec4-5720-4b01-aab9-5f9131b765a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.548 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:26 compute-0 ovn_controller[94697]: 2026-01-23T09:42:26Z|00692|binding|INFO|Releasing lport c957aec4-5720-4b01-aab9-5f9131b765a9 from this chassis (sb_readonly=0)
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.549 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2482367c-d492-48cd-8a65-fd27ef9491ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2482367c-d492-48cd-8a65-fd27ef9491ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.550 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b65da22-b8c3-42da-859a-cc770615e0a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.551 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-2482367c-d492-48cd-8a65-fd27ef9491ab
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/2482367c-d492-48cd-8a65-fd27ef9491ab.pid.haproxy
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 2482367c-d492-48cd-8a65-fd27ef9491ab
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:42:26 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:26.551 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'env', 'PROCESS_TAG=haproxy-2482367c-d492-48cd-8a65-fd27ef9491ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2482367c-d492-48cd-8a65-fd27ef9491ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.561 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:26 compute-0 podman[231500]: 2026-01-23 09:42:26.825292896 +0000 UTC m=+0.031077269 container create fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 09:42:26 compute-0 systemd[1]: Started libpod-conmon-fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52.scope.
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.853 182096 DEBUG nova.network.neutron [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updated VIF entry in instance network info cache for port 1a75b58f-843f-4939-99e9-00991ea0a602. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.854 182096 DEBUG nova.network.neutron [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:26 compute-0 nova_compute[182092]: 2026-01-23 09:42:26.868 182096 DEBUG oslo_concurrency.lockutils [req-d96add38-5192-41be-a179-52b7762cddeb req-d9dd2031-a6eb-4145-ba66-147ea42fa299 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:26 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:42:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b44e5b8173979ace5bf6e492073611e9c266bd8b66e1c5a043dc8376dee5943/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:42:26 compute-0 podman[231500]: 2026-01-23 09:42:26.883329175 +0000 UTC m=+0.089113548 container init fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:42:26 compute-0 podman[231500]: 2026-01-23 09:42:26.887314892 +0000 UTC m=+0.093099255 container start fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:42:26 compute-0 podman[231500]: 2026-01-23 09:42:26.810724872 +0000 UTC m=+0.016509256 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:42:26 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [NOTICE]   (231515) : New worker (231517) forked
Jan 23 09:42:26 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [NOTICE]   (231515) : Loading success.
Jan 23 09:42:27 compute-0 nova_compute[182092]: 2026-01-23 09:42:27.564 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.017 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.018 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.018 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.018 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.019 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No event matching network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d in dict_keys([('network-vif-plugged', '1a75b58f-843f-4939-99e9-00991ea0a602')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.019 182096 WARNING nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received unexpected event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d for instance with vm_state building and task_state spawning.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.019 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.020 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.020 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.020 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.020 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Processing event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.021 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.021 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.021 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.021 182096 DEBUG oslo_concurrency.lockutils [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.022 182096 DEBUG nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No waiting events found dispatching network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.022 182096 WARNING nova.compute.manager [req-4012254d-6700-4edc-b3d6-1c5032ec795e req-ed22950e-97d0-42a5-af02-665f307c189c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received unexpected event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 for instance with vm_state building and task_state spawning.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.022 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.026 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161348.0262942, 1d01d877-250f-4069-b7a9-da76e21520a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.026 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] VM Resumed (Lifecycle Event)
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.027 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.030 182096 INFO nova.virt.libvirt.driver [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance spawned successfully.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.030 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.045 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.050 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.052 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.053 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.053 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.053 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.054 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.054 182096 DEBUG nova.virt.libvirt.driver [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.077 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.116 182096 INFO nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Took 8.49 seconds to spawn the instance on the hypervisor.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.116 182096 DEBUG nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.190 182096 INFO nova.compute.manager [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Took 8.89 seconds to build instance.
Jan 23 09:42:28 compute-0 nova_compute[182092]: 2026-01-23 09:42:28.205 182096 DEBUG oslo_concurrency.lockutils [None req-fa0b81de-e160-4afa-b6aa-b371988e40d5 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:29 compute-0 podman[231522]: 2026-01-23 09:42:29.208221315 +0000 UTC m=+0.046368743 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 23 09:42:29 compute-0 podman[231523]: 2026-01-23 09:42:29.214090725 +0000 UTC m=+0.049850377 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 23 09:42:29 compute-0 nova_compute[182092]: 2026-01-23 09:42:29.707 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:31 compute-0 nova_compute[182092]: 2026-01-23 09:42:31.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.204 182096 DEBUG nova.compute.manager [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.204 182096 DEBUG nova.compute.manager [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing instance network info cache due to event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.205 182096 DEBUG oslo_concurrency.lockutils [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.206 182096 DEBUG oslo_concurrency.lockutils [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.206 182096 DEBUG nova.network.neutron [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing network info cache for port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:42:32 compute-0 nova_compute[182092]: 2026-01-23 09:42:32.566 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.003 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'name': 'tempest-TestGettingAddress-server-1786233331', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a9', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd4181f6c647942e881af13381cc2f253', 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'hostId': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.004 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'name': 'tempest-TestNetworkBasicOps-server-600889586', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a2', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'user_id': '8aa2911d0bc0474cb77214528548d308', 'hostId': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.007 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1d01d877-250f-4069-b7a9-da76e21520a5 / tap8f10cfa2-5b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.007 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1d01d877-250f-4069-b7a9-da76e21520a5 / tap1a75b58f-84 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.007 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.008 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.009 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d2ecc8e0-c714-4bea-a634-4908c7d6cdef / tap063af96a-93 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.010 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d2ecc8e0-c714-4bea-a634-4908c7d6cdef / tap2a029ff6-f6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.010 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.bytes volume: 127505 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.010 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.bytes volume: 3654 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ef4e2e-5987-45f9-93b6-5a2f0e34a499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.005350', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd77804e0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'badf9c588ae4b9bba3894b13b68245de2404b6f751b43910c627bc3815b64748'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.005350', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd778105c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '979e34c41ef0b39e92638a6ce17d8188c365f729cba26bac93f073d1af25a5c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 127505, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.005350', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd7786642-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '84c6281f7e9113bb31eb86b75c3e29ee9fd1ababbbe4e5ce9b4e9edd3a273d12'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3654, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.005350', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd778710a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '3ab5ea7bceabd1a50bfe9424591d88557c8e6efd807706ef4759108eb86dead9'}]}, 'timestamp': '2026-01-23 09:42:33.010934', '_unique_id': '13948188c247445f943830d2d44b6ee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.011 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.012 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.013 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.013 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets volume: 747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.013 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets volume: 51 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2515b5-fa0d-4962-ad26-5c90dd4ebf17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.012887', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd778c81c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'f75155209f16e19c286a02becb6a9587861937c6bda6095104962b41bede8d33'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.012887', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd778d1fe-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'bdf113984d155acf9050e36fa9e923f0f783f9ff9a0dae0bde3e300b3625a69e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 747, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.012887', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd778db9a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'a95254df6c170c68758a27a7647f7ad04b05b9b45ddf870782e27f838aa0a986'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 51, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.012887', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd778e554-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'ba72e33ff0014fefd4d1ec4228d2a7429496648439d9f5021cdd272801e9db52'}]}, 'timestamp': '2026-01-23 09:42:33.013890', '_unique_id': 'c61f965b855149e5a3a840d9341a8087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.014 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.022 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.022 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.029 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee6aa50b-a099-45df-9949-73f9de9af034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.015369', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd77a39ae-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': 'ca56494a452109bbfd31491a21e35b15aa94215324d0cb9211dbeb81fac3a4e8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.015369', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd77a44e4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': '6e59100f9b9fa32abb63eebbc6d7724f7ffa063c5f156fba45d9dbffd1de8242'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.015369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd77b5ad2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': '93cadee79881b96e9b2664980c98f8f1dadc5ca67e7faa00fec246d2a7db30c3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.015369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd77b6464-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': '03527dcfaa966a559d99a50b3f83bc9ae56711b721562d16cc45ea79803023e7'}]}, 'timestamp': '2026-01-23 09:42:33.030250', '_unique_id': '0d52618070d34972b976ed2f917e6898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.031 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.031 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>]
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.049 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.049 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.067 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.requests volume: 381 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.067 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81d27686-fa08-42b8-9f79-703de4ad9a61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.032226', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd77e6704-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '9a881d8fe827f82dbde937c57ebed5552644edbac3226a43171e908bda0dfffb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.032226', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd77e7136-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '3b91fedfe60debe0eae34ad4c620d3aeb06258a8317275f48ab8639c47f25447'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 381, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.032226', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7811486-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': 'cf5e35608d3ce232a218f7966341c5536686dffa02499c0608d6d388ef71f2b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.032226', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7811fc6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '3b068ea05bc362d76c79b8c2a6757303ead1678db88870e8ad01e5ae4671a5d9'}]}, 'timestamp': '2026-01-23 09:42:33.067822', '_unique_id': 'faffd46130534faca120c99fc22d62f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.068 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.069 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.069 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.070 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.070 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92814858-90b4-41c3-8a95-dcac8f2cf93e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.069459', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd7816aa8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '9b4d118b5a6e1bc7fa660312999823b3afce05e5fff21b87416afc61c9aa70f7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.069459', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd78174ee-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '4760b38536ee01c8621a7a0aa2973820de97ae976a39db242d07b4f671fce8ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.069459', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd7817e08-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '646673e2e5a36404000ff489ed03555bf1e9da21ca294cd10bf142d239d78132'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.069459', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd78186f0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'c0e7445d171d6290559132fd08d669b37f8e11688aa583234eac3cc011d97ff3'}]}, 'timestamp': '2026-01-23 09:42:33.070456', '_unique_id': '2233cf01ddc74bb5b891897cea711e34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>]
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.latency volume: 116469589 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.latency volume: 5323639 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.072 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.latency volume: 197726741 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.latency volume: 56262170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58eb17dd-dbf6-4907-bbd8-026dadf14d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 116469589, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.072408', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd781dde4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '1f59c8307b0238ed5070dfaf54b1e73c94a043b80fa2d21e0382ad7d4c2b9230'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5323639, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.072408', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd781e7e4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '9157b9ab5c90e74d38743fcb570b2b92e936135b0511c680328d73f7aca2452e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197726741, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.072408', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd781f090-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '4ae9f139d864a4fe4ee1bbd634f16e714686c583e3ffbbd557814c3b82f2779f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56262170, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.072408', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd781f8e2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '4f898d7275bbd40eb5e1186727959a61fecaa4f25a0cede8218f1fbc5bf7c9fd'}]}, 'timestamp': '2026-01-23 09:42:33.073367', '_unique_id': 'a6af1088d2274fd1ab2e1086ea626443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.073 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.074 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.085 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.085 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.085 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e8df089-b378-4960-bcea-95e7734ead79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.074880', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd7823d8e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'f511fc8b6a4c665b19bc4e97c22a387e858f7aac5cf50adba5cd6a895978d623'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.074880', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd783caa0-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'c8d1ed3bb2101828f265ac32a1b35bf297065278097ef4541f6dc1282d904db6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.074880', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd783d2f2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '3dbb9358bc84c91d204d5ec07ba2e390f718379e54fe03abe819130a2d319060'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.074880', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd783db1c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'f96cf18570782b04d6293ab87d3e6a834166fa8a360845eed180ac8211ff1d2d'}]}, 'timestamp': '2026-01-23 09:42:33.085734', '_unique_id': '590640cbffd64e758a988817f6f58ca8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.087 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.087 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.087 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets volume: 659 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.087 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dcc5e90-c33f-49cb-b16b-4568f75d3c6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.087062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd784183e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '99d33f05a7556f79cf100a822de1ff7c61c47076e1b67229ae690b0cba4ac06b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.087062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd78420ae-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '22db0ad0cc7f77c8d39e77776d9b005e2f427bdb3bf8261335563f09450bf35e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 659, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.087062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd78428e2-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '20d1f119ec354838c167541cd1b74c4c43b39251d58ba4593875e3ed4d099e20'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.087062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd7843166-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '8f73842e8759ed5acb0bc580436e554353d6381e625c409c4fcb32da30dc40cc'}]}, 'timestamp': '2026-01-23 09:42:33.087926', '_unique_id': 'f1ba26a51cff45f4befdfe5a02b8b310'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.089 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.089 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.089 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.089 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '035a625d-9ef2-4f82-8c0a-609259847508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.089039', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd7846564-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'fb83a657d2cd2ec946859b247205555b06a2fc5b92153f00de4b02f2ca67055e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.089039', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd7846d48-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '65178cdb8dbdfd4b1e564f0f07c55ebfa4e40a767e87fb94fab2428e22b09592'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.089039', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd784754a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'be1b5c10629604ee535fe0d8ccceeb3ebfaedae6c4e8e31026c4fd3189a62510'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.089039', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd7847d9c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '1b9555449e84499688a98497870235a36126c7b25a32c2af9943d05772163ec8'}]}, 'timestamp': '2026-01-23 09:42:33.089878', '_unique_id': '0f2f549e91074c3bb1d20181797df662'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.090 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.092 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.092 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.093 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.bytes volume: 30218752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.093 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72ba0db6-75e3-4a6a-8486-efa5650b4805', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.092451', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd784f042-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'e270a26d7ba5b05af27a889fc46f963a10a61a6803c4b7d39256844206d7b997'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.092451', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7850096-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '58e868a8cdb949067e1244fe8a8105c5b65d553c8b968303c833ad91eb1b5716'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30218752, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.092451', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7850f3c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '8644077e3569f9d597766ff013052f73ff2b8d07a82a270406e52fa351fc32c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.092451', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7852008-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '7029614b8d610f742d3773f9b2fc171cca3c5e56a4d428003ebc6e971ba5707a'}]}, 'timestamp': '2026-01-23 09:42:33.094097', '_unique_id': '03745dc2a01c417d926cab668f0df3b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.094 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.096 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.096 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.097 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.bytes volume: 113228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.097 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.bytes volume: 4950 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4737d2b-6045-4670-a362-ff511bf4a4e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.096291', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd7858606-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '1ee2b7dff57a2a111643742051277ef593b047511672d946317587a96ec96816'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.096291', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd7859740-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '2ca9bc65ecc110063bc4885179fe77af8bbbd173445c27a6b6b7be1e1d8c0509'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 113228, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.096291', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd785a4f6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '290ee317d2f57b7080b136abdff874e0b0fe3532a546a8098d4644788aa1e336'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4950, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.096291', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd785b612-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'bad7e0085d98d22ba347a45a4b895d94d87d59cbb1f7b41e2c1ebd2403d6acce'}]}, 'timestamp': '2026-01-23 09:42:33.097938', '_unique_id': 'a691cb825b1448508cb7409cdd02944d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.098 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.099 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.099 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>]
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.100 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.130 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.130 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 1d01d877-250f-4069-b7a9-da76e21520a5: ceilometer.compute.pollsters.NoVolumeException
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.143 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/memory.usage volume: 42.8359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ffa4fcf-8ed2-44e4-83ef-0588eb45eefb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8359375, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'timestamp': '2026-01-23T09:42:33.100219', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd78cbe9e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.672915748, 'message_signature': '927d07f3b8a4a22b9a884456252c9518361a48358a9e8d047792f32b13726d9f'}]}, 'timestamp': '2026-01-23 09:42:33.144012', '_unique_id': 'fa0e76908b9648828a04d32cf98a593f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.144 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.147 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.147 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.148 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.usage volume: 30212096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.148 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10ae446b-d683-43fe-a5ad-9deebbcea29a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.147481', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78d5606-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': '07b28612745b3db10a59aeb051fce5bd8c864bde8e38959ef0f46b0d39b7d6e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.147481', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78d62fe-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': '03c5d07e17048764ed6f3478cb5cbde36fd1f8c81e032c62de471247c9a1d9c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30212096, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.147481', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78d6ede-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': '70dc63abae44e7fca50357791e6ee9f04295bda75202f8aa4cd90b71340c45ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.147481', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78d7b40-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': '72580f6e8e176e10ae9e487fc222301a3538983f32c69ca12a0057e462f628e3'}]}, 'timestamp': '2026-01-23 09:42:33.148804', '_unique_id': '80a9f5ee28904a6cb783d657eadc35f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.149 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.152 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.152 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1786233331>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-600889586>]
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.152 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.153 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.153 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.bytes volume: 73400320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.153 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba7e77cc-0904-491f-a151-65fa5636f9d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.152830', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78e2504-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'a81fd7857fbc38551e28be41a37c30cd0e8e0061c7c5258dba98807530841aea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.152830', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78e3120-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'dcc6f0a70b49bf9cdebfac92c9e019c305812a23ce636b6f8a841e52a82ddb7c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73400320, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.152830', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78e3e72-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '1efdaabe436b1312a8529deee3d33afd82ddd9dd1e91c37b5d18404e1b68bb5c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.152830', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78e4a02-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '7084eeb0ceedcd18a1dc4e4f4171d8a1df7c810724f1584b2b85d1d3593cd21b'}]}, 'timestamp': '2026-01-23 09:42:33.154094', '_unique_id': '4073231242894d04974c4622c72bf3a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.154 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.157 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.157 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.158 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.158 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5a4a274-cb25-4758-ab06-6b1c63000bd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.157376', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78ed724-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': '754cedf279680afb9f253b1f52db37f86d13cfcbee8bfbc6148a2922800e4e15'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.157376', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78ee43a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.544754613, 'message_signature': '3864698fbb25a7841c9a4ab25aa1a6786ddd83acb3b3a8ecaa5b89f64acdee98'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.157376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd78ef042-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': '52d851b925900687ceed7da5476dc0e1ac31beec4a4fb65719eb9cc3d853179b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.157376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd78efc18-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.552249815, 'message_signature': 'e9b0e7588b3ce6e3028895b3ce4913055fd4294d2095ab2b76794f6628d404f7'}]}, 'timestamp': '2026-01-23 09:42:33.158688', '_unique_id': '16b4cfd5511643bb949f9f22db420f95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.159 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.162 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.162 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.162 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4234491-afae-42c6-89a0-4d80e0eb3e9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.162058', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd78f8dcc-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '95088b09d017a6b42bbc28937a70f3c97b93c4f3b638ec26ffd4870c25114554'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.162058', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd78f9a88-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '87bb25733c3a885e934e89dfffd56527797e848e01b2ccab6a17bee7265451d6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.162058', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd78fa79e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '2216bc8dd82fc72a837bd7d943ef4e5f583aaa714e90d9b4dc161f4454cc662d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.162058', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd78fb32e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'c30c8aecc05cf8b690cb3b0e1a71c88f6a6feae6b358cc832c4484b950d0884b'}]}, 'timestamp': '2026-01-23 09:42:33.163342', '_unique_id': '4fae8f7b5ca7458283912670552dd1a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.163 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.166 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.166 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/cpu volume: 4930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.166 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/cpu volume: 10860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e7aeeb7-cc38-41ca-a608-4fbb6ed35c8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4930000000, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'timestamp': '2026-01-23T09:42:33.166602', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd7904064-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.659798829, 'message_signature': 'b8c40f87c53e752f718e334af22d9a3b2392df0b86a5593d9ed76056e4205668'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10860000000, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 
'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'timestamp': '2026-01-23T09:42:33.166602', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd7904c8a-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.672915748, 'message_signature': '75687a2012c4ab083cb3425f84a10028b0583b531b99ea1944f3f5622a70b9d5'}]}, 'timestamp': '2026-01-23 09:42:33.167267', '_unique_id': '1152d3b6a74b494190ba673e5c326fcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.167 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.170 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.170 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.171 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.latency volume: 368902417 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.171 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa1189e7-5f79-44e1-8a25-faa76a9f0a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.170452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd790d718-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'b72faa78a6ff90868e30034c59d5510f15762e3d24a5a6dd36f4b9b261f259f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.170452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd790e316-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': '6720babb965c31a56527cf33580b21c7b6ab2fafab99e740229f9e73b53f5d8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 368902417, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.170452', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd790eea6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '7221cf214b022de481d84465b47f7099a67d152eee49b31ef42a6e6c30068ce2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.170452', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd790fa2c-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '0950e2db5d22fd805b27b7441a525bf614286590ea9e588a22018d6adf7c1531'}]}, 'timestamp': '2026-01-23 09:42:33.171740', '_unique_id': '5a164caf144a46d1992c43448517fd11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.172 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.174 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.175 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.175 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.175 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1542fca1-4b25-4b64-a230-0c063a90e353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.174941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd7918514-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'aa439db28b2fa173b61aed495d10130ca9852564c01e4a7e8420676baf7d265a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.174941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd7919180-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': '30e9f1f57e436dbe3ebc0fa19634cc2841ad233b769ccb366a9055c9cbefd5f4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.174941', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': 
'37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd7919f0e-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'caeb7146e5113e6e604511ac083f20f3de424c4188e75ce9363af78c2338febe'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.174941', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd791aac6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'c6b568dcabe2af30c85e116c47fab0657a3f8b7bb6fb3337a0ee8710f86a39a5'}]}, 'timestamp': '2026-01-23 09:42:33.176236', '_unique_id': 'c2a67b50c03d45ad824cc595c9fef012'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.176 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.179 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.179 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.180 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.180 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be170b2-5253-465c-853a-6d2f9e0eeb58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-vda', 'timestamp': '2026-01-23T09:42:33.179433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd79235a4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'b977e88944f6efdcda05e7a4783284cc25958f6e9f99a99f14f922e68a54dfc0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '1d01d877-250f-4069-b7a9-da76e21520a5-sda', 'timestamp': '2026-01-23T09:42:33.179433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'instance-000000a9', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd79241b6-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.561610074, 'message_signature': 'a9b9dc6ffea4e3689be36c2979f5a8a529b876299aaec17214180e2682d849f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-vda', 'timestamp': '2026-01-23T09:42:33.179433', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7924d28-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': 'b143b7b83882cbd877d6027c0daadded5fe71df7ecf10227caf67e510c0dbbad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef-sda', 'timestamp': '2026-01-23T09:42:33.179433', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'instance-000000a2', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd79258b8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.579604107, 'message_signature': '2a760bd8409981e773a3b1601a15ae21916aac16aab95ee1c4d58bee0d849bd8'}]}, 'timestamp': '2026-01-23 09:42:33.180718', '_unique_id': '1b42f09c1fa94d8cbfc1c0914a3b50a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.181 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.182 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.183 12 DEBUG ceilometer.compute.pollsters [-] 1d01d877-250f-4069-b7a9-da76e21520a5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.184 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.184 12 DEBUG ceilometer.compute.pollsters [-] d2ecc8e0-c714-4bea-a634-4908c7d6cdef/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58ce7b22-d4b1-429c-a5e2-13760957b644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap8f10cfa2-5b', 'timestamp': '2026-01-23T09:42:33.182863', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap8f10cfa2-5b', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:7d:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f10cfa2-5b'}, 'message_id': 'd792bcb8-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'd2353d81c7952b888528ee0c4bb4c89dc76db424a87d91cadcded6cf4fc85e40'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000a9-1d01d877-250f-4069-b7a9-da76e21520a5-tap1a75b58f-84', 'timestamp': '2026-01-23T09:42:33.182863', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1786233331', 'name': 'tap1a75b58f-84', 'instance_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:30:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a75b58f-84'}, 'message_id': 'd792d892-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.534737283, 'message_signature': 'bd00542603a700703a5f73e47fa8b163dd569990fb7db41b52a8d38012e9f4dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap063af96a-93', 'timestamp': '2026-01-23T09:42:33.182863', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap063af96a-93', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:1e:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap063af96a-93'}, 'message_id': 'd792edb4-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': 'e214099aeeabfeb9500c655d65e2f9ee816d97dc87e459953f0ae8738a9bbf5f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_name': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_name': None, 'resource_id': 'instance-000000a2-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-tap2a029ff6-f6', 'timestamp': '2026-01-23T09:42:33.182863', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-600889586', 'name': 'tap2a029ff6-f6', 'instance_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'instance_type': 'm1.nano', 'host': '37acf24753d109455a3a0e26c2b472a768e7dc7ccd7a4dad25d86fbc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:ef:9f:1c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2a029ff6-f6'}, 'message_id': 'd792f976-f83f-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 4858.537804661, 'message_signature': '559caadf1aa808072445b691973fbcf27dac05716d4ec2123c9f13de565bffc2'}]}, 'timestamp': '2026-01-23 09:42:33.184818', '_unique_id': '12642cbe27844e9fa86da9c647d8a4e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:42:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:42:33.185 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:42:33 compute-0 nova_compute[182092]: 2026-01-23 09:42:33.745 182096 DEBUG nova.network.neutron [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updated VIF entry in instance network info cache for port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:42:33 compute-0 nova_compute[182092]: 2026-01-23 09:42:33.745 182096 DEBUG nova.network.neutron [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:33 compute-0 nova_compute[182092]: 2026-01-23 09:42:33.761 182096 DEBUG oslo_concurrency.lockutils [req-f1d9ceb4-1253-4b72-993d-c3674d01ea25 req-c05d1c03-d299-403b-a43b-413b4f5e450a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:34 compute-0 nova_compute[182092]: 2026-01-23 09:42:34.711 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:37 compute-0 nova_compute[182092]: 2026-01-23 09:42:37.568 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:39 compute-0 nova_compute[182092]: 2026-01-23 09:42:39.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:39 compute-0 podman[231570]: 2026-01-23 09:42:39.798492997 +0000 UTC m=+0.057716011 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:42:39 compute-0 podman[231571]: 2026-01-23 09:42:39.825166075 +0000 UTC m=+0.083118874 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:42:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:39.872 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:39.873 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:39.874 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:40 compute-0 ovn_controller[94697]: 2026-01-23T09:42:40Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:7d:6f 10.100.0.11
Jan 23 09:42:40 compute-0 ovn_controller[94697]: 2026-01-23T09:42:40Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:7d:6f 10.100.0.11
Jan 23 09:42:42 compute-0 nova_compute[182092]: 2026-01-23 09:42:42.569 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:42.820 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:42 compute-0 nova_compute[182092]: 2026-01-23 09:42:42.820 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:42 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:42.821 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.037 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-2a029ff6-f6a7-4903-a87d-48339ec92a1b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.037 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-2a029ff6-f6a7-4903-a87d-48339ec92a1b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.045 182096 DEBUG nova.objects.instance [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'flavor' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.063 182096 DEBUG nova.virt.libvirt.vif [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.064 182096 DEBUG nova.network.os_vif_util [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.064 182096 DEBUG nova.network.os_vif_util [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.066 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.068 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.069 182096 DEBUG nova.virt.libvirt.driver [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Attempting to detach device tap2a029ff6-f6 from instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.070 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:ef:9f:1c"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <target dev="tap2a029ff6-f6"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </interface>
Jan 23 09:42:43 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.074 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.076 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface>not found in domain: <domain type='kvm' id='80'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <name>instance-000000a2</name>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <uuid>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</uuid>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:41:24</nova:creationTime>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:port uuid="2a029ff6-f6a7-4903-a87d-48339ec92a1b">
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <system>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='serial'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='uuid'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </system>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <os>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </os>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <features>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </features>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk' index='2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config' index='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:c8:1e:7e'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='tap063af96a-93'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:ef:9f:1c'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='tap2a029ff6-f6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='net1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       </target>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </console>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <video>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </video>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c296,c487</label>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c296,c487</imagelabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </domain>
Jan 23 09:42:43 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.078 182096 INFO nova.virt.libvirt.driver [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully detached device tap2a029ff6-f6 from instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef from the persistent domain config.
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.078 182096 DEBUG nova.virt.libvirt.driver [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] (1/8): Attempting to detach device tap2a029ff6-f6 with device alias net1 from instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.078 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] detach device xml: <interface type="ethernet">
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <mac address="fa:16:3e:ef:9f:1c"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <model type="virtio"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <mtu size="1442"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <target dev="tap2a029ff6-f6"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </interface>
Jan 23 09:42:43 compute-0 nova_compute[182092]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 09:42:43 compute-0 kernel: tap2a029ff6-f6 (unregistering): left promiscuous mode
Jan 23 09:42:43 compute-0 NetworkManager[54920]: <info>  [1769161363.1699] device (tap2a029ff6-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:42:43 compute-0 ovn_controller[94697]: 2026-01-23T09:42:43Z|00693|binding|INFO|Releasing lport 2a029ff6-f6a7-4903-a87d-48339ec92a1b from this chassis (sb_readonly=0)
Jan 23 09:42:43 compute-0 ovn_controller[94697]: 2026-01-23T09:42:43Z|00694|binding|INFO|Setting lport 2a029ff6-f6a7-4903-a87d-48339ec92a1b down in Southbound
Jan 23 09:42:43 compute-0 ovn_controller[94697]: 2026-01-23T09:42:43Z|00695|binding|INFO|Removing iface tap2a029ff6-f6 ovn-installed in OVS
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.179 182096 DEBUG nova.virt.libvirt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Received event <DeviceRemovedEvent: 1769161363.1789916, d2ecc8e0-c714-4bea-a634-4908c7d6cdef => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.180 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.182 182096 DEBUG nova.virt.libvirt.driver [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Start waiting for the detach event from libvirt for device tap2a029ff6-f6 with device alias net1 for instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.182 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.187 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:9f:1c 10.100.0.18', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6385902e-a3fc-45ed-ae47-016bab41e365', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f76504-0307-4659-beb0-ded71ba0faa8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a029ff6-f6a7-4903-a87d-48339ec92a1b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.188 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a029ff6-f6a7-4903-a87d-48339ec92a1b in datapath 6385902e-a3fc-45ed-ae47-016bab41e365 unbound from our chassis
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.189 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6385902e-a3fc-45ed-ae47-016bab41e365, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.185 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface>not found in domain: <domain type='kvm' id='80'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <name>instance-000000a2</name>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <uuid>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</uuid>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:41:24</nova:creationTime>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:port uuid="2a029ff6-f6a7-4903-a87d-48339ec92a1b">
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <system>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='serial'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='uuid'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </system>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <os>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </os>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <features>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </features>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk' index='2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config' index='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.190 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe80b8d-8837-4a9b-8702-daaca5f7f57c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.190 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365 namespace which is not needed anymore
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:c8:1e:7e'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target dev='tap063af96a-93'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       </target>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </console>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <video>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </video>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c296,c487</label>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c296,c487</imagelabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </domain>
Jan 23 09:42:43 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.185 182096 INFO nova.virt.libvirt.driver [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully detached device tap2a029ff6-f6 from instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef from the live domain config.
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.186 182096 DEBUG nova.virt.libvirt.vif [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.186 182096 DEBUG nova.network.os_vif_util [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.187 182096 DEBUG nova.network.os_vif_util [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.187 182096 DEBUG os_vif [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.189 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.189 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a029ff6-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.195 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.197 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.201 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.211 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.212 182096 INFO os_vif [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6')
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.213 182096 DEBUG nova.virt.libvirt.guest [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:42:43</nova:creationTime>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:43 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:43 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:43 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:43 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:43 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:42:43 compute-0 podman[231608]: 2026-01-23 09:42:43.234221485 +0000 UTC m=+0.073050628 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64)
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [NOTICE]   (231087) : haproxy version is 2.8.14-c23fe91
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [NOTICE]   (231087) : path to executable is /usr/sbin/haproxy
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [WARNING]  (231087) : Exiting Master process...
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [WARNING]  (231087) : Exiting Master process...
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [ALERT]    (231087) : Current worker (231089) exited with code 143 (Terminated)
Jan 23 09:42:43 compute-0 neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365[231083]: [WARNING]  (231087) : All workers exited. Exiting... (0)
Jan 23 09:42:43 compute-0 systemd[1]: libpod-6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81.scope: Deactivated successfully.
Jan 23 09:42:43 compute-0 conmon[231083]: conmon 6d529238e0d52a3c88f0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81.scope/container/memory.events
Jan 23 09:42:43 compute-0 podman[231648]: 2026-01-23 09:42:43.300253463 +0000 UTC m=+0.035962908 container died 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:42:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81-userdata-shm.mount: Deactivated successfully.
Jan 23 09:42:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-458d357c8482050f163f74c363d895805681d29ffcd09c36021f879265a26298-merged.mount: Deactivated successfully.
Jan 23 09:42:43 compute-0 podman[231648]: 2026-01-23 09:42:43.322576443 +0000 UTC m=+0.058285879 container cleanup 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:42:43 compute-0 systemd[1]: libpod-conmon-6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81.scope: Deactivated successfully.
Jan 23 09:42:43 compute-0 podman[231672]: 2026-01-23 09:42:43.371102722 +0000 UTC m=+0.028492131 container remove 6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.374 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a4dbfb55-deb3-4dae-b3cd-00b45c9084f4]: (4, ('Fri Jan 23 09:42:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365 (6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81)\n6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81\nFri Jan 23 09:42:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365 (6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81)\n6d529238e0d52a3c88f0468f12917dccce97f4af3d9c6fad122464aac1564f81\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.375 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[221b3094-89b3-4857-ad9f-30a22b6eba2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.376 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6385902e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.377 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 kernel: tap6385902e-a0: left promiscuous mode
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.391 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.392 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0f92c3-94e5-4fb2-b869-6a6c6b8b2693]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.400 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec10b74-49ea-42fb-b8a6-84b84f330408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.400 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[10715b3a-6af6-49a0-9b1e-864a9b4c1859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.413 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c78d79c2-df6d-42c5-9e29-b02457c88e50]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478986, 'reachable_time': 39512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231685, 'error': None, 'target': 'ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.414 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6385902e-a3fc-45ed-ae47-016bab41e365 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:42:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:43.415 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[caf27c89-66dc-45ea-8305-491bc6e454a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d6385902e\x2da3fc\x2d45ed\x2dae47\x2d016bab41e365.mount: Deactivated successfully.
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.819 182096 DEBUG nova.compute.manager [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-unplugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.819 182096 DEBUG oslo_concurrency.lockutils [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.819 182096 DEBUG oslo_concurrency.lockutils [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.820 182096 DEBUG oslo_concurrency.lockutils [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.820 182096 DEBUG nova.compute.manager [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-unplugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:42:43 compute-0 nova_compute[182092]: 2026-01-23 09:42:43.820 182096 WARNING nova.compute.manager [req-116114dd-bc1b-491d-bc87-a4af4c1b097d req-575b3dd4-9dad-4abf-92b0-daa2ac794f22 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-unplugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b for instance with vm_state active and task_state None.
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.013 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.014 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.014 182096 DEBUG nova.network.neutron [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.112 182096 DEBUG nova.compute.manager [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-deleted-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.112 182096 INFO nova.compute.manager [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Neutron deleted interface 2a029ff6-f6a7-4903-a87d-48339ec92a1b; detaching it from the instance and deleting it from the info cache
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.113 182096 DEBUG nova.network.neutron [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.132 182096 DEBUG nova.objects.instance [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lazy-loading 'system_metadata' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.150 182096 DEBUG nova.objects.instance [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lazy-loading 'flavor' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.168 182096 DEBUG nova.virt.libvirt.vif [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.168 182096 DEBUG nova.network.os_vif_util [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.168 182096 DEBUG nova.network.os_vif_util [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.170 182096 DEBUG nova.virt.libvirt.guest [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.172 182096 DEBUG nova.virt.libvirt.guest [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface>not found in domain: <domain type='kvm' id='80'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <name>instance-000000a2</name>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <uuid>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</uuid>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:42:43</nova:creationTime>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='serial'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='uuid'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk' index='2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config' index='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:c8:1e:7e'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='tap063af96a-93'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       </target>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </console>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c296,c487</label>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c296,c487</imagelabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:42:44 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.172 182096 DEBUG nova.virt.libvirt.guest [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.174 182096 DEBUG nova.virt.libvirt.guest [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:9f:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2a029ff6-f6"/></interface>not found in domain: <domain type='kvm' id='80'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <name>instance-000000a2</name>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <uuid>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</uuid>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:42:43</nova:creationTime>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <memory unit='KiB'>131072</memory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <vcpu placement='static'>1</vcpu>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <resource>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <partition>/machine</partition>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </resource>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <sysinfo type='smbios'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <system>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='manufacturer'>RDO</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='serial'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='uuid'>d2ecc8e0-c714-4bea-a634-4908c7d6cdef</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <entry name='family'>Virtual Machine</entry>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </system>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <os>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <boot dev='hd'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <smbios mode='sysinfo'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </os>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <features>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <vmcoreinfo state='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </features>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <model fallback='forbid'>Nehalem</model>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='x2apic'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='hypervisor'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <feature policy='require' name='vme'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <clock offset='utc'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <timer name='hpet' present='no'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_poweroff>destroy</on_poweroff>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_reboot>restart</on_reboot>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <on_crash>destroy</on_crash>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <disk type='file' device='disk'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk' index='2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backingStore type='file' index='3'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <format type='raw'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <source file='/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <backingStore/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       </backingStore>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='vda' bus='virtio'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='virtio-disk0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <disk type='file' device='cdrom'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/disk.config' index='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backingStore/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='sda' bus='sata'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <readonly/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='sata0-0-0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pcie.0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='1' port='0x10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='2' port='0x11'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='3' port='0x12'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='4' port='0x13'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='5' port='0x14'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='6' port='0x15'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='7' port='0x16'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='8' port='0x17'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.8'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='9' port='0x18'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.9'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='10' port='0x19'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='11' port='0x1a'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.11'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='12' port='0x1b'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.12'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='13' port='0x1c'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.13'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='14' port='0x1d'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.14'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='15' port='0x1e'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.15'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='16' port='0x1f'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.16'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='17' port='0x20'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.17'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='18' port='0x21'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.18'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='19' port='0x22'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.19'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='20' port='0x23'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.20'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='21' port='0x24'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.21'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='22' port='0x25'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.22'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='23' port='0x26'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.23'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='24' port='0x27'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.24'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-root-port'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target chassis='25' port='0x28'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.25'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model name='pcie-pci-bridge'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='pci.26'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='usb'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <controller type='sata' index='0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='ide'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </controller>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <interface type='ethernet'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <mac address='fa:16:3e:c8:1e:7e'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target dev='tap063af96a-93'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model type='virtio'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <mtu size='1442'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='net0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <serial type='pty'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target type='isa-serial' port='0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:         <model name='isa-serial'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       </target>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <source path='/dev/pts/0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <log file='/var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef/console.log' append='off'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <target type='serial' port='0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='serial0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </console>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='tablet' bus='usb'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='usb' bus='0' port='1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='mouse' bus='ps2'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input1'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <input type='keyboard' bus='ps2'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='input2'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </input>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <listen type='address' address='::0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </graphics>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <audio id='1' type='none'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <video>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='video0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </video>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <watchdog model='itco' action='reset'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='watchdog0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </watchdog>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <memballoon model='virtio'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <stats period='10'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='balloon0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <rng model='virtio'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <backend model='random'>/dev/urandom</backend>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <alias name='rng0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <label>system_u:system_r:svirt_t:s0:c296,c487</label>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c296,c487</imagelabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <label>+107:+107</label>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <imagelabel>+107:+107</imagelabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </seclabel>
Jan 23 09:42:44 compute-0 nova_compute[182092]: </domain>
Jan 23 09:42:44 compute-0 nova_compute[182092]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.175 182096 WARNING nova.virt.libvirt.driver [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Detaching interface fa:16:3e:ef:9f:1c failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap2a029ff6-f6' not found.
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.175 182096 DEBUG nova.virt.libvirt.vif [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.175 182096 DEBUG nova.network.os_vif_util [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converting VIF {"id": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "address": "fa:16:3e:ef:9f:1c", "network": {"id": "6385902e-a3fc-45ed-ae47-016bab41e365", "bridge": "br-int", "label": "tempest-network-smoke--914582681", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a029ff6-f6", "ovs_interfaceid": "2a029ff6-f6a7-4903-a87d-48339ec92a1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.176 182096 DEBUG nova.network.os_vif_util [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.176 182096 DEBUG os_vif [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.178 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.178 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a029ff6-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.178 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.180 182096 INFO os_vif [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:9f:1c,bridge_name='br-int',has_traffic_filtering=True,id=2a029ff6-f6a7-4903-a87d-48339ec92a1b,network=Network(6385902e-a3fc-45ed-ae47-016bab41e365),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a029ff6-f6')
Jan 23 09:42:44 compute-0 nova_compute[182092]: 2026-01-23 09:42:44.180 182096 DEBUG nova.virt.libvirt.guest [req-ce8c59cb-8f03-46e7-bb27-85c5785b6eeb req-d5bc26bb-cf71-403f-ba3e-e49d142ecd34 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:name>tempest-TestNetworkBasicOps-server-600889586</nova:name>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:creationTime>2026-01-23 09:42:44</nova:creationTime>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:flavor name="m1.nano">
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:memory>128</nova:memory>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:disk>1</nova:disk>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:swap>0</nova:swap>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:vcpus>1</nova:vcpus>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:flavor>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:owner>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   <nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     <nova:port uuid="063af96a-932f-4abc-b86a-e046a2f8ba53">
Jan 23 09:42:44 compute-0 nova_compute[182092]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 09:42:44 compute-0 nova_compute[182092]:     </nova:port>
Jan 23 09:42:44 compute-0 nova_compute[182092]:   </nova:ports>
Jan 23 09:42:44 compute-0 nova_compute[182092]: </nova:instance>
Jan 23 09:42:44 compute-0 nova_compute[182092]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 09:42:45 compute-0 ovn_controller[94697]: 2026-01-23T09:42:45Z|00696|binding|INFO|Releasing lport 083ece3e-9182-4251-86be-2cee076862d9 from this chassis (sb_readonly=0)
Jan 23 09:42:45 compute-0 ovn_controller[94697]: 2026-01-23T09:42:45Z|00697|binding|INFO|Releasing lport c957aec4-5720-4b01-aab9-5f9131b765a9 from this chassis (sb_readonly=0)
Jan 23 09:42:45 compute-0 ovn_controller[94697]: 2026-01-23T09:42:45Z|00698|binding|INFO|Releasing lport f270e0ea-abb3-46b6-8ab7-f3b2d93dd703 from this chassis (sb_readonly=0)
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.268 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.549 182096 INFO nova.network.neutron [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Port 2a029ff6-f6a7-4903-a87d-48339ec92a1b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.550 182096 DEBUG nova.network.neutron [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [{"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.565 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.585 182096 DEBUG oslo_concurrency.lockutils [None req-c5011267-7422-47dc-bea5-388a3470749a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "interface-d2ecc8e0-c714-4bea-a634-4908c7d6cdef-2a029ff6-f6a7-4903-a87d-48339ec92a1b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.913 182096 DEBUG nova.compute.manager [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.913 182096 DEBUG oslo_concurrency.lockutils [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.913 182096 DEBUG oslo_concurrency.lockutils [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.913 182096 DEBUG oslo_concurrency.lockutils [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.914 182096 DEBUG nova.compute.manager [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:42:45 compute-0 nova_compute[182092]: 2026-01-23 09:42:45.914 182096 WARNING nova.compute.manager [req-bd25e6d7-00c6-4be9-8f9e-af773d16d674 req-88ff8073-fea8-470a-bb26-acc5f74ace70 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-plugged-2a029ff6-f6a7-4903-a87d-48339ec92a1b for instance with vm_state active and task_state None.
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.066 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.067 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.067 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.067 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.067 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.074 182096 INFO nova.compute.manager [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Terminating instance
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.079 182096 DEBUG nova.compute.manager [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:42:46 compute-0 kernel: tap063af96a-93 (unregistering): left promiscuous mode
Jan 23 09:42:46 compute-0 NetworkManager[54920]: <info>  [1769161366.1037] device (tap063af96a-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.107 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 ovn_controller[94697]: 2026-01-23T09:42:46Z|00699|binding|INFO|Releasing lport 063af96a-932f-4abc-b86a-e046a2f8ba53 from this chassis (sb_readonly=0)
Jan 23 09:42:46 compute-0 ovn_controller[94697]: 2026-01-23T09:42:46Z|00700|binding|INFO|Setting lport 063af96a-932f-4abc-b86a-e046a2f8ba53 down in Southbound
Jan 23 09:42:46 compute-0 ovn_controller[94697]: 2026-01-23T09:42:46Z|00701|binding|INFO|Removing iface tap063af96a-93 ovn-installed in OVS
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.109 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.112 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:1e:7e 10.100.0.6'], port_security=['fa:16:3e:c8:1e:7e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd2ecc8e0-c714-4bea-a634-4908c7d6cdef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d8e2ea8-a5ae-4aa4-917a-02fc18fac44b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6166dd82-cb3d-4477-9fed-bb08c05a2a36, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=063af96a-932f-4abc-b86a-e046a2f8ba53) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.113 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 063af96a-932f-4abc-b86a-e046a2f8ba53 in datapath 17a214aa-cca5-4eb9-92b6-24247cdd6a0d unbound from our chassis
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.114 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17a214aa-cca5-4eb9-92b6-24247cdd6a0d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.115 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9dfef3e2-de67-44cc-b870-46be22238576]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.115 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d namespace which is not needed anymore
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.123 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 23 09:42:46 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a2.scope: Consumed 13.950s CPU time.
Jan 23 09:42:46 compute-0 systemd-machined[153562]: Machine qemu-80-instance-000000a2 terminated.
Jan 23 09:42:46 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [NOTICE]   (230708) : haproxy version is 2.8.14-c23fe91
Jan 23 09:42:46 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [NOTICE]   (230708) : path to executable is /usr/sbin/haproxy
Jan 23 09:42:46 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [ALERT]    (230708) : Current worker (230710) exited with code 143 (Terminated)
Jan 23 09:42:46 compute-0 neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d[230704]: [WARNING]  (230708) : All workers exited. Exiting... (0)
Jan 23 09:42:46 compute-0 systemd[1]: libpod-275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687.scope: Deactivated successfully.
Jan 23 09:42:46 compute-0 podman[231704]: 2026-01-23 09:42:46.214101927 +0000 UTC m=+0.034172138 container died 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:42:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687-userdata-shm.mount: Deactivated successfully.
Jan 23 09:42:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-352cff5850991215301d4c05f63baa3c5a1d3db642ed53f962a0e267baba74be-merged.mount: Deactivated successfully.
Jan 23 09:42:46 compute-0 podman[231704]: 2026-01-23 09:42:46.231505083 +0000 UTC m=+0.051575294 container cleanup 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:42:46 compute-0 systemd[1]: libpod-conmon-275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687.scope: Deactivated successfully.
Jan 23 09:42:46 compute-0 podman[231729]: 2026-01-23 09:42:46.266446439 +0000 UTC m=+0.021322450 container remove 275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.269 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9e86f2-0b99-4ac1-b279-ba8ef75259ad]: (4, ('Fri Jan 23 09:42:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d (275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687)\n275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687\nFri Jan 23 09:42:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d (275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687)\n275667eb7498ebc2b25262b54a712c61c917867564274e4a0e3a03e98a7eb687\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.270 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0433216d-4c94-49a5-9471-88bcd946bf93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.271 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17a214aa-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:46 compute-0 kernel: tap17a214aa-c0: left promiscuous mode
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.274 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.282 182096 DEBUG nova.compute.manager [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-unplugged-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.282 182096 DEBUG oslo_concurrency.lockutils [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.282 182096 DEBUG oslo_concurrency.lockutils [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.283 182096 DEBUG oslo_concurrency.lockutils [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.283 182096 DEBUG nova.compute.manager [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-unplugged-063af96a-932f-4abc-b86a-e046a2f8ba53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.284 182096 DEBUG nova.compute.manager [req-5ccdc4db-8788-420e-a021-676f005dced1 req-5a0acff6-4674-4aa3-be0c-ea09394c8ce6 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-unplugged-063af96a-932f-4abc-b86a-e046a2f8ba53 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.289 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 NetworkManager[54920]: <info>  [1769161366.2907] manager: (tap063af96a-93): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.293 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[52b1911e-01c1-4abb-939c-5121182c8259]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.304 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[36fa4c45-7812-4221-8398-8f603b09e815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.305 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c48f3b86-ceff-4d4b-b80d-38f06041cdfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.319 182096 INFO nova.virt.libvirt.driver [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance destroyed successfully.
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.319 182096 DEBUG nova.objects.instance [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid d2ecc8e0-c714-4bea-a634-4908c7d6cdef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.321 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[915550e6-d87e-4eb8-8338-3ed73f6531f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476279, 'reachable_time': 16420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231748, 'error': None, 'target': 'ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d17a214aa\x2dcca5\x2d4eb9\x2d92b6\x2d24247cdd6a0d.mount: Deactivated successfully.
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.324 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17a214aa-cca5-4eb9-92b6-24247cdd6a0d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:42:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:46.324 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[89771148-81ed-48af-8484-6a3846b9da83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.336 182096 DEBUG nova.virt.libvirt.vif [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-600889586',display_name='tempest-TestNetworkBasicOps-server-600889586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-600889586',id=162,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFrgZHvHVK1bq5r866l4n2uvUX1V2Spuw5Jjdnxe33g/d3zcHa1kEGpnp/9SPBpKi+QlvZavi01KTaKOHYfhgn3Q4cq86NAmFETqHk/t/dckIzxeaFSBExNYLfSFt2JKkg==',key_name='tempest-TestNetworkBasicOps-160331337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-htfci0rz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:40:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d2ecc8e0-c714-4bea-a634-4908c7d6cdef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.337 182096 DEBUG nova.network.os_vif_util [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "063af96a-932f-4abc-b86a-e046a2f8ba53", "address": "fa:16:3e:c8:1e:7e", "network": {"id": "17a214aa-cca5-4eb9-92b6-24247cdd6a0d", "bridge": "br-int", "label": "tempest-network-smoke--264179079", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap063af96a-93", "ovs_interfaceid": "063af96a-932f-4abc-b86a-e046a2f8ba53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.337 182096 DEBUG nova.network.os_vif_util [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.337 182096 DEBUG os_vif [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.339 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.339 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap063af96a-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.340 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.341 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.342 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.344 182096 INFO os_vif [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:7e,bridge_name='br-int',has_traffic_filtering=True,id=063af96a-932f-4abc-b86a-e046a2f8ba53,network=Network(17a214aa-cca5-4eb9-92b6-24247cdd6a0d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap063af96a-93')
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.344 182096 INFO nova.virt.libvirt.driver [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Deleting instance files /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef_del
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.345 182096 INFO nova.virt.libvirt.driver [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Deletion of /var/lib/nova/instances/d2ecc8e0-c714-4bea-a634-4908c7d6cdef_del complete
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.393 182096 INFO nova.compute.manager [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.393 182096 DEBUG oslo.service.loopingcall [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.393 182096 DEBUG nova.compute.manager [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:42:46 compute-0 nova_compute[182092]: 2026-01-23 09:42:46.393 182096 DEBUG nova.network.neutron [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.230 182096 DEBUG nova.network.neutron [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.240 182096 INFO nova.compute.manager [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Took 0.85 seconds to deallocate network for instance.
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.286 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.287 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.288 182096 DEBUG nova.compute.manager [req-b25bd6d1-e644-465e-b6fe-30c302716d4b req-fd9d8d0f-2408-49b6-89b6-7cbc0a6b23bf 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-deleted-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.342 182096 DEBUG nova.compute.provider_tree [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.351 182096 DEBUG nova.scheduler.client.report [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.364 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.382 182096 INFO nova.scheduler.client.report [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance d2ecc8e0-c714-4bea-a634-4908c7d6cdef
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.436 182096 DEBUG oslo_concurrency.lockutils [None req-16c20497-7b42-48ce-a779-6493505b5319 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:47 compute-0 nova_compute[182092]: 2026-01-23 09:42:47.571 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.252 182096 DEBUG nova.compute.manager [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.252 182096 DEBUG nova.compute.manager [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing instance network info cache due to event network-changed-063af96a-932f-4abc-b86a-e046a2f8ba53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.253 182096 DEBUG oslo_concurrency.lockutils [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.253 182096 DEBUG oslo_concurrency.lockutils [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.253 182096 DEBUG nova.network.neutron [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Refreshing network info cache for port 063af96a-932f-4abc-b86a-e046a2f8ba53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.355 182096 DEBUG nova.compute.manager [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.356 182096 DEBUG oslo_concurrency.lockutils [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.356 182096 DEBUG oslo_concurrency.lockutils [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.357 182096 DEBUG oslo_concurrency.lockutils [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d2ecc8e0-c714-4bea-a634-4908c7d6cdef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.357 182096 DEBUG nova.compute.manager [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] No waiting events found dispatching network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.357 182096 WARNING nova.compute.manager [req-64d552af-a785-46ba-893e-4f7324e5618b req-62fc0ec5-d45d-4f6f-8476-5ae7de3df3bc 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Received unexpected event network-vif-plugged-063af96a-932f-4abc-b86a-e046a2f8ba53 for instance with vm_state deleted and task_state None.
Jan 23 09:42:48 compute-0 nova_compute[182092]: 2026-01-23 09:42:48.685 182096 DEBUG nova.network.neutron [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:42:49 compute-0 nova_compute[182092]: 2026-01-23 09:42:49.684 182096 DEBUG nova.network.neutron [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 23 09:42:49 compute-0 nova_compute[182092]: 2026-01-23 09:42:49.685 182096 DEBUG oslo_concurrency.lockutils [req-11e1dbd7-f6ac-4de1-9544-f4de42f9954c req-8fab18c8-12c7-4501-8cbe-0e646e340b5e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d2ecc8e0-c714-4bea-a634-4908c7d6cdef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:42:50 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:42:50.823 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:42:51 compute-0 nova_compute[182092]: 2026-01-23 09:42:51.340 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:52 compute-0 nova_compute[182092]: 2026-01-23 09:42:52.572 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:53 compute-0 podman[231762]: 2026-01-23 09:42:53.220204377 +0000 UTC m=+0.058479289 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:42:54 compute-0 ovn_controller[94697]: 2026-01-23T09:42:54Z|00702|binding|INFO|Releasing lport c957aec4-5720-4b01-aab9-5f9131b765a9 from this chassis (sb_readonly=0)
Jan 23 09:42:54 compute-0 ovn_controller[94697]: 2026-01-23T09:42:54Z|00703|binding|INFO|Releasing lport f270e0ea-abb3-46b6-8ab7-f3b2d93dd703 from this chassis (sb_readonly=0)
Jan 23 09:42:54 compute-0 nova_compute[182092]: 2026-01-23 09:42:54.194 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:56 compute-0 nova_compute[182092]: 2026-01-23 09:42:56.341 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:56 compute-0 nova_compute[182092]: 2026-01-23 09:42:56.402 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:42:57 compute-0 nova_compute[182092]: 2026-01-23 09:42:57.573 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:00 compute-0 podman[231786]: 2026-01-23 09:43:00.220303148 +0000 UTC m=+0.052002001 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:43:00 compute-0 podman[231787]: 2026-01-23 09:43:00.238251054 +0000 UTC m=+0.067563233 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:43:00 compute-0 nova_compute[182092]: 2026-01-23 09:43:00.866 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:01 compute-0 nova_compute[182092]: 2026-01-23 09:43:01.318 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161366.3173826, d2ecc8e0-c714-4bea-a634-4908c7d6cdef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:01 compute-0 nova_compute[182092]: 2026-01-23 09:43:01.318 182096 INFO nova.compute.manager [-] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] VM Stopped (Lifecycle Event)
Jan 23 09:43:01 compute-0 nova_compute[182092]: 2026-01-23 09:43:01.336 182096 DEBUG nova.compute.manager [None req-fc1945e7-574f-4b48-9b2d-7cb46731a10d - - - - - -] [instance: d2ecc8e0-c714-4bea-a634-4908c7d6cdef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:01 compute-0 nova_compute[182092]: 2026-01-23 09:43:01.343 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:02 compute-0 nova_compute[182092]: 2026-01-23 09:43:02.575 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:06 compute-0 nova_compute[182092]: 2026-01-23 09:43:06.343 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.225 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.225 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.235 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.306 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.306 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.310 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.310 182096 INFO nova.compute.claims [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.412 182096 DEBUG nova.compute.provider_tree [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.422 182096 DEBUG nova.scheduler.client.report [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.439 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.439 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.480 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.480 182096 DEBUG nova.network.neutron [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.497 182096 INFO nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.518 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.576 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.616 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.616 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.617 182096 INFO nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Creating image(s)
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.617 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.618 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.618 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.628 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.677 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.677 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.678 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.687 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.733 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.734 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.756 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.757 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.757 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.803 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.804 182096 DEBUG nova.virt.disk.api [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.804 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.850 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.851 182096 DEBUG nova.virt.disk.api [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.851 182096 DEBUG nova.objects.instance [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid 16778216-8d6c-45fc-8e13-8025078ee2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.867 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.867 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Ensure instance console log exists: /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.868 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.868 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:07 compute-0 nova_compute[182092]: 2026-01-23 09:43:07.868 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:08 compute-0 nova_compute[182092]: 2026-01-23 09:43:08.768 182096 DEBUG nova.policy [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:43:10 compute-0 podman[231841]: 2026-01-23 09:43:10.203717337 +0000 UTC m=+0.038614908 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:43:10 compute-0 podman[231840]: 2026-01-23 09:43:10.204512495 +0000 UTC m=+0.041839362 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.313 182096 DEBUG nova.network.neutron [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Successfully updated port: 1e90751a-bc38-434a-a631-26ea57e33c68 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.330 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.330 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.330 182096 DEBUG nova.network.neutron [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.344 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.382 182096 DEBUG nova.compute.manager [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-changed-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.382 182096 DEBUG nova.compute.manager [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Refreshing instance network info cache due to event network-changed-1e90751a-bc38-434a-a631-26ea57e33c68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.382 182096 DEBUG oslo_concurrency.lockutils [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:43:11 compute-0 nova_compute[182092]: 2026-01-23 09:43:11.441 182096 DEBUG nova.network.neutron [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.578 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.853 182096 DEBUG nova.network.neutron [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updating instance_info_cache with network_info: [{"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.875 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.875 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Instance network_info: |[{"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.876 182096 DEBUG oslo_concurrency.lockutils [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.876 182096 DEBUG nova.network.neutron [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Refreshing network info cache for port 1e90751a-bc38-434a-a631-26ea57e33c68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.878 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Start _get_guest_xml network_info=[{"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.883 182096 WARNING nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.886 182096 DEBUG nova.virt.libvirt.host [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.887 182096 DEBUG nova.virt.libvirt.host [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.890 182096 DEBUG nova.virt.libvirt.host [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.890 182096 DEBUG nova.virt.libvirt.host [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.891 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.891 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.892 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.892 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.892 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.892 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.892 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.893 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.893 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.893 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.893 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.893 182096 DEBUG nova.virt.hardware [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.896 182096 DEBUG nova.virt.libvirt.vif [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1692204101',display_name='tempest-TestNetworkBasicOps-server-1692204101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1692204101',id=172,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHxWwTsx/ithwhQ9jbtr1IpFqNhzT2hC7ku1TUxGaNj2STuOPk7w7CnMSr1nTJuvBgMKueZtd7Lc3FhBjEbE/uejCH0HyDy6IAjfPUEjl476VeP37t/jhA9Gn39Qmdcng==',key_name='tempest-TestNetworkBasicOps-67632650',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-84os2gmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:07Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=16778216-8d6c-45fc-8e13-8025078ee2a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.896 182096 DEBUG nova.network.os_vif_util [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.897 182096 DEBUG nova.network.os_vif_util [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.898 182096 DEBUG nova.objects.instance [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 16778216-8d6c-45fc-8e13-8025078ee2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.910 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <uuid>16778216-8d6c-45fc-8e13-8025078ee2a9</uuid>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <name>instance-000000ac</name>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-1692204101</nova:name>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:43:12</nova:creationTime>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         <nova:port uuid="1e90751a-bc38-434a-a631-26ea57e33c68">
Jan 23 09:43:12 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <system>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="serial">16778216-8d6c-45fc-8e13-8025078ee2a9</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="uuid">16778216-8d6c-45fc-8e13-8025078ee2a9</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </system>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <os>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </os>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <features>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </features>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.config"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:0d:a5:2b"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <target dev="tap1e90751a-bc"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/console.log" append="off"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <video>
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </video>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:43:12 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:43:12 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:43:12 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:43:12 compute-0 nova_compute[182092]: </domain>
Jan 23 09:43:12 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.911 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Preparing to wait for external event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.911 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.912 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.912 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.912 182096 DEBUG nova.virt.libvirt.vif [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1692204101',display_name='tempest-TestNetworkBasicOps-server-1692204101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1692204101',id=172,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHxWwTsx/ithwhQ9jbtr1IpFqNhzT2hC7ku1TUxGaNj2STuOPk7w7CnMSr1nTJuvBgMKueZtd7Lc3FhBjEbE/uejCH0HyDy6IAjfPUEjl476VeP37t/jhA9Gn39Qmdcng==',key_name='tempest-TestNetworkBasicOps-67632650',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-84os2gmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:07Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=16778216-8d6c-45fc-8e13-8025078ee2a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.912 182096 DEBUG nova.network.os_vif_util [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.913 182096 DEBUG nova.network.os_vif_util [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.913 182096 DEBUG os_vif [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.914 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.914 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.914 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.916 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.916 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e90751a-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.917 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e90751a-bc, col_values=(('external_ids', {'iface-id': '1e90751a-bc38-434a-a631-26ea57e33c68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:a5:2b', 'vm-uuid': '16778216-8d6c-45fc-8e13-8025078ee2a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:12 compute-0 NetworkManager[54920]: <info>  [1769161392.9189] manager: (tap1e90751a-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.919 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.923 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.924 182096 INFO os_vif [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc')
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.960 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.960 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.960 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:0d:a5:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:43:12 compute-0 nova_compute[182092]: 2026-01-23 09:43:12.961 182096 INFO nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Using config drive
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.379 182096 INFO nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Creating config drive at /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.config
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.383 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3j3blvg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.502 182096 DEBUG oslo_concurrency.processutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3j3blvg" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:13 compute-0 kernel: tap1e90751a-bc: entered promiscuous mode
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.5512] manager: (tap1e90751a-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.550 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:13 compute-0 ovn_controller[94697]: 2026-01-23T09:43:13Z|00704|binding|INFO|Claiming lport 1e90751a-bc38-434a-a631-26ea57e33c68 for this chassis.
Jan 23 09:43:13 compute-0 ovn_controller[94697]: 2026-01-23T09:43:13Z|00705|binding|INFO|1e90751a-bc38-434a-a631-26ea57e33c68: Claiming fa:16:3e:0d:a5:2b 10.100.0.10
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.561 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:a5:2b 10.100.0.10'], port_security=['fa:16:3e:0d:a5:2b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1502344290', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '16778216-8d6c-45fc-8e13-8025078ee2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1502344290', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1a496382-70b1-408c-bf2f-e11df7c98661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=846ae243-747c-4e53-a918-379e9172af21, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1e90751a-bc38-434a-a631-26ea57e33c68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.562 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1e90751a-bc38-434a-a631-26ea57e33c68 in datapath b796a7f2-e983-443b-bcea-75bc6ccd43b4 bound to our chassis
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.564 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b796a7f2-e983-443b-bcea-75bc6ccd43b4
Jan 23 09:43:13 compute-0 ovn_controller[94697]: 2026-01-23T09:43:13Z|00706|binding|INFO|Setting lport 1e90751a-bc38-434a-a631-26ea57e33c68 ovn-installed in OVS
Jan 23 09:43:13 compute-0 ovn_controller[94697]: 2026-01-23T09:43:13Z|00707|binding|INFO|Setting lport 1e90751a-bc38-434a-a631-26ea57e33c68 up in Southbound
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.571 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.573 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[516fa229-b1ec-4ad5-a43b-bc0435ac276e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.574 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb796a7f2-e1 in ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.575 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb796a7f2-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.575 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[21645d5c-f6f1-4007-9d28-51e4cd2c8c80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.576 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[94fc3f63-d0f3-4c6b-91e8-9a876de91202]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.586 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[d1736c3d-d689-44ab-8bfd-1f93286054c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 systemd-udevd[231909]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.595 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[368175ed-f865-4aac-afd1-48a8913e5314]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 systemd-machined[153562]: New machine qemu-83-instance-000000ac.
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.6011] device (tap1e90751a-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.6018] device (tap1e90751a-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:43:13 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-000000ac.
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.619 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[2e633e6e-a0dc-4fbc-8eac-2ee3a5261bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 podman[231892]: 2026-01-23 09:43:13.623578162 +0000 UTC m=+0.075758284 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, release=1755695350)
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.6250] manager: (tapb796a7f2-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Jan 23 09:43:13 compute-0 systemd-udevd[231917]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.624 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e1e442-da43-463e-a7a3-f67991e8ae4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.652 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad65bc1-d15e-42bc-a931-e541a61de46f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.655 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[902f57fd-a18d-4362-b0a2-641f415c47d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.6721] device (tapb796a7f2-e0): carrier: link connected
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.675 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9424d8d1-fc69-47a0-9b6b-c860aabe9168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.690 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0028d8ef-9a99-4b85-97de-a1543543d228]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb796a7f2-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:8f:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489915, 'reachable_time': 21600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231941, 'error': None, 'target': 'ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.702 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[39be7370-71df-49c4-8f8d-3d0d8e5c13e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:8ff3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489915, 'tstamp': 489915}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231942, 'error': None, 'target': 'ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.715 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[16ed765e-7096-4838-a5fd-1a83f76fe73e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb796a7f2-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:8f:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489915, 'reachable_time': 21600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231943, 'error': None, 'target': 'ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.741 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[10e3c323-253b-4c3f-93c8-2f0e1c2a5f95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.783 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ceae995-f069-47e8-8958-b56fa53ec755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.784 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb796a7f2-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.784 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.785 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb796a7f2-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:13 compute-0 NetworkManager[54920]: <info>  [1769161393.7869] manager: (tapb796a7f2-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 23 09:43:13 compute-0 kernel: tapb796a7f2-e0: entered promiscuous mode
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.788 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb796a7f2-e0, col_values=(('external_ids', {'iface-id': 'fe3f7c35-fe6d-4d66-84b6-73f41550b510'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:13 compute-0 ovn_controller[94697]: 2026-01-23T09:43:13Z|00708|binding|INFO|Releasing lport fe3f7c35-fe6d-4d66-84b6-73f41550b510 from this chassis (sb_readonly=0)
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.801 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.804 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.806 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b796a7f2-e983-443b-bcea-75bc6ccd43b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b796a7f2-e983-443b-bcea-75bc6ccd43b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:43:13 compute-0 nova_compute[182092]: 2026-01-23 09:43:13.806 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.807 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4956b3ba-f1ca-44cb-93a1-0b7f64abcc65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.807 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-b796a7f2-e983-443b-bcea-75bc6ccd43b4
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/b796a7f2-e983-443b-bcea-75bc6ccd43b4.pid.haproxy
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID b796a7f2-e983-443b-bcea-75bc6ccd43b4
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:43:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:13.808 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'env', 'PROCESS_TAG=haproxy-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b796a7f2-e983-443b-bcea-75bc6ccd43b4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.026 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161394.025752, 16778216-8d6c-45fc-8e13-8025078ee2a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.027 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] VM Started (Lifecycle Event)
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.045 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.048 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161394.0258763, 16778216-8d6c-45fc-8e13-8025078ee2a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.048 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] VM Paused (Lifecycle Event)
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.067 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.069 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.088 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:43:14 compute-0 podman[231978]: 2026-01-23 09:43:14.098445807 +0000 UTC m=+0.032244691 container create a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:43:14 compute-0 systemd[1]: Started libpod-conmon-a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3.scope.
Jan 23 09:43:14 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:43:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de51649cac92a31a49ccdd28c360b159bea6d7d37ade1ef30d68c435db88d1aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:43:14 compute-0 podman[231978]: 2026-01-23 09:43:14.149360124 +0000 UTC m=+0.083159039 container init a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:43:14 compute-0 podman[231978]: 2026-01-23 09:43:14.153689283 +0000 UTC m=+0.087488166 container start a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:43:14 compute-0 podman[231978]: 2026-01-23 09:43:14.084718588 +0000 UTC m=+0.018517462 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:43:14 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [NOTICE]   (231995) : New worker (231997) forked
Jan 23 09:43:14 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [NOTICE]   (231995) : Loading success.
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.227 182096 DEBUG nova.network.neutron [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updated VIF entry in instance network info cache for port 1e90751a-bc38-434a-a631-26ea57e33c68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.228 182096 DEBUG nova.network.neutron [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updating instance_info_cache with network_info: [{"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:14 compute-0 nova_compute[182092]: 2026-01-23 09:43:14.240 182096 DEBUG oslo_concurrency.lockutils [req-b6f4e3e0-f271-4afb-a9d8-df492e750d01 req-676c6e04-9774-442b-af44-b7bf26e48393 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.682 182096 DEBUG nova.compute.manager [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.683 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.683 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.683 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.684 182096 DEBUG nova.compute.manager [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Processing event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.684 182096 DEBUG nova.compute.manager [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.684 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.684 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.684 182096 DEBUG oslo_concurrency.lockutils [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.685 182096 DEBUG nova.compute.manager [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] No waiting events found dispatching network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.685 182096 WARNING nova.compute.manager [req-53cdb46d-d40d-4a25-ba28-8ec35056ec84 req-4ea8ba49-023c-44fb-b01e-e725e4d8d581 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received unexpected event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 for instance with vm_state building and task_state spawning.
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.685 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.688 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161395.6878462, 16778216-8d6c-45fc-8e13-8025078ee2a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.688 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] VM Resumed (Lifecycle Event)
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.689 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.691 182096 INFO nova.virt.libvirt.driver [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Instance spawned successfully.
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.691 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.707 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.710 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.710 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.710 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.711 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.711 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.711 182096 DEBUG nova.virt.libvirt.driver [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.714 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.755 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.776 182096 INFO nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Took 8.16 seconds to spawn the instance on the hypervisor.
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.777 182096 DEBUG nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.845 182096 INFO nova.compute.manager [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Took 8.57 seconds to build instance.
Jan 23 09:43:15 compute-0 nova_compute[182092]: 2026-01-23 09:43:15.856 182096 DEBUG oslo_concurrency.lockutils [None req-aa093528-4ff1-47a9-be78-b7d8e25f2b85 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:16 compute-0 nova_compute[182092]: 2026-01-23 09:43:16.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:17 compute-0 nova_compute[182092]: 2026-01-23 09:43:17.580 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:17 compute-0 nova_compute[182092]: 2026-01-23 09:43:17.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:17 compute-0 nova_compute[182092]: 2026-01-23 09:43:17.648 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:17 compute-0 nova_compute[182092]: 2026-01-23 09:43:17.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:17 compute-0 nova_compute[182092]: 2026-01-23 09:43:17.919 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.852 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.852 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.853 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:43:20 compute-0 nova_compute[182092]: 2026-01-23 09:43:20.853 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1d01d877-250f-4069-b7a9-da76e21520a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:43:21 compute-0 ovn_controller[94697]: 2026-01-23T09:43:21Z|00709|binding|INFO|Releasing lport fe3f7c35-fe6d-4d66-84b6-73f41550b510 from this chassis (sb_readonly=0)
Jan 23 09:43:21 compute-0 ovn_controller[94697]: 2026-01-23T09:43:21Z|00710|binding|INFO|Releasing lport c957aec4-5720-4b01-aab9-5f9131b765a9 from this chassis (sb_readonly=0)
Jan 23 09:43:21 compute-0 ovn_controller[94697]: 2026-01-23T09:43:21Z|00711|binding|INFO|Releasing lport f270e0ea-abb3-46b6-8ab7-f3b2d93dd703 from this chassis (sb_readonly=0)
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.749 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.965 182096 DEBUG nova.compute.manager [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-changed-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.966 182096 DEBUG nova.compute.manager [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Refreshing instance network info cache due to event network-changed-1e90751a-bc38-434a-a631-26ea57e33c68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.966 182096 DEBUG oslo_concurrency.lockutils [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.967 182096 DEBUG oslo_concurrency.lockutils [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:43:21 compute-0 nova_compute[182092]: 2026-01-23 09:43:21.967 182096 DEBUG nova.network.neutron [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Refreshing network info cache for port 1e90751a-bc38-434a-a631-26ea57e33c68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.197 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.197 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.198 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.198 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.198 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.205 182096 INFO nova.compute.manager [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Terminating instance
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.211 182096 DEBUG nova.compute.manager [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:43:22 compute-0 kernel: tap1e90751a-bc (unregistering): left promiscuous mode
Jan 23 09:43:22 compute-0 NetworkManager[54920]: <info>  [1769161402.2262] device (tap1e90751a-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:43:22 compute-0 ovn_controller[94697]: 2026-01-23T09:43:22Z|00712|binding|INFO|Releasing lport 1e90751a-bc38-434a-a631-26ea57e33c68 from this chassis (sb_readonly=0)
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.235 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 ovn_controller[94697]: 2026-01-23T09:43:22Z|00713|binding|INFO|Setting lport 1e90751a-bc38-434a-a631-26ea57e33c68 down in Southbound
Jan 23 09:43:22 compute-0 ovn_controller[94697]: 2026-01-23T09:43:22Z|00714|binding|INFO|Removing iface tap1e90751a-bc ovn-installed in OVS
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.238 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.243 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:a5:2b 10.100.0.10'], port_security=['fa:16:3e:0d:a5:2b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1502344290', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '16778216-8d6c-45fc-8e13-8025078ee2a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1502344290', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1a496382-70b1-408c-bf2f-e11df7c98661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.190'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=846ae243-747c-4e53-a918-379e9172af21, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1e90751a-bc38-434a-a631-26ea57e33c68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.244 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1e90751a-bc38-434a-a631-26ea57e33c68 in datapath b796a7f2-e983-443b-bcea-75bc6ccd43b4 unbound from our chassis
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.246 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b796a7f2-e983-443b-bcea-75bc6ccd43b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.247 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd51c6d-a9a1-40f2-b808-b2fbcf53c6f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.249 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4 namespace which is not needed anymore
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.252 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Jan 23 09:43:22 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ac.scope: Consumed 6.985s CPU time.
Jan 23 09:43:22 compute-0 systemd-machined[153562]: Machine qemu-83-instance-000000ac terminated.
Jan 23 09:43:22 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [NOTICE]   (231995) : haproxy version is 2.8.14-c23fe91
Jan 23 09:43:22 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [NOTICE]   (231995) : path to executable is /usr/sbin/haproxy
Jan 23 09:43:22 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [ALERT]    (231995) : Current worker (231997) exited with code 143 (Terminated)
Jan 23 09:43:22 compute-0 neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4[231991]: [WARNING]  (231995) : All workers exited. Exiting... (0)
Jan 23 09:43:22 compute-0 systemd[1]: libpod-a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3.scope: Deactivated successfully.
Jan 23 09:43:22 compute-0 conmon[231991]: conmon a7f25dd13f660bf68b95 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3.scope/container/memory.events
Jan 23 09:43:22 compute-0 podman[232023]: 2026-01-23 09:43:22.3450416 +0000 UTC m=+0.034218273 container died a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:43:22 compute-0 podman[232023]: 2026-01-23 09:43:22.361175674 +0000 UTC m=+0.050352346 container cleanup a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 09:43:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-de51649cac92a31a49ccdd28c360b159bea6d7d37ade1ef30d68c435db88d1aa-merged.mount: Deactivated successfully.
Jan 23 09:43:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3-userdata-shm.mount: Deactivated successfully.
Jan 23 09:43:22 compute-0 systemd[1]: libpod-conmon-a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3.scope: Deactivated successfully.
Jan 23 09:43:22 compute-0 podman[232045]: 2026-01-23 09:43:22.402970382 +0000 UTC m=+0.024695587 container remove a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.406 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6cee49-1d08-4e17-a530-3e531083d7a6]: (4, ('Fri Jan 23 09:43:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4 (a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3)\na7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3\nFri Jan 23 09:43:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4 (a7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3)\na7f25dd13f660bf68b95de044bdc89ce4a1407da288e73ca64b34186511d2dd3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.407 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[208c69b5-445a-40bd-94e6-af48ab0908b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.408 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb796a7f2-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.409 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 kernel: tapb796a7f2-e0: left promiscuous mode
Jan 23 09:43:22 compute-0 NetworkManager[54920]: <info>  [1769161402.4254] manager: (tap1e90751a-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.427 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.428 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bf41990b-a9f2-4a06-a265-2ead12519245]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.437 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a769f2-b63f-4a1e-80f9-d194dcec6af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.438 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[63185e5f-890c-4d9f-9e54-e23cc683457d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.452 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d80422a2-4404-409a-958b-320a7a88cd86]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489910, 'reachable_time': 17018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232069, 'error': None, 'target': 'ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 systemd[1]: run-netns-ovnmeta\x2db796a7f2\x2de983\x2d443b\x2dbcea\x2d75bc6ccd43b4.mount: Deactivated successfully.
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.455 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b796a7f2-e983-443b-bcea-75bc6ccd43b4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:43:22 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:22.455 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[3bcef715-9d80-4c00-a1b2-6739f0484974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.456 182096 INFO nova.virt.libvirt.driver [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Instance destroyed successfully.
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.456 182096 DEBUG nova.objects.instance [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid 16778216-8d6c-45fc-8e13-8025078ee2a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.469 182096 DEBUG nova.virt.libvirt.vif [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:43:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1692204101',display_name='tempest-TestNetworkBasicOps-server-1692204101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1692204101',id=172,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCHxWwTsx/ithwhQ9jbtr1IpFqNhzT2hC7ku1TUxGaNj2STuOPk7w7CnMSr1nTJuvBgMKueZtd7Lc3FhBjEbE/uejCH0HyDy6IAjfPUEjl476VeP37t/jhA9Gn39Qmdcng==',key_name='tempest-TestNetworkBasicOps-67632650',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:43:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-84os2gmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:43:15Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=16778216-8d6c-45fc-8e13-8025078ee2a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.469 182096 DEBUG nova.network.os_vif_util [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.471 182096 DEBUG nova.network.os_vif_util [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.471 182096 DEBUG os_vif [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.472 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.472 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e90751a-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.473 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.474 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.476 182096 INFO os_vif [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a5:2b,bridge_name='br-int',has_traffic_filtering=True,id=1e90751a-bc38-434a-a631-26ea57e33c68,network=Network(b796a7f2-e983-443b-bcea-75bc6ccd43b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1e90751a-bc')
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.477 182096 INFO nova.virt.libvirt.driver [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Deleting instance files /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9_del
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.477 182096 INFO nova.virt.libvirt.driver [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Deletion of /var/lib/nova/instances/16778216-8d6c-45fc-8e13-8025078ee2a9_del complete
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.565 182096 INFO nova.compute.manager [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.566 182096 DEBUG oslo.service.loopingcall [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.566 182096 DEBUG nova.compute.manager [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.566 182096 DEBUG nova.network.neutron [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:43:22 compute-0 nova_compute[182092]: 2026-01-23 09:43:22.581 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:23 compute-0 nova_compute[182092]: 2026-01-23 09:43:23.804 182096 DEBUG nova.network.neutron [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updated VIF entry in instance network info cache for port 1e90751a-bc38-434a-a631-26ea57e33c68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:43:23 compute-0 nova_compute[182092]: 2026-01-23 09:43:23.804 182096 DEBUG nova.network.neutron [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updating instance_info_cache with network_info: [{"id": "1e90751a-bc38-434a-a631-26ea57e33c68", "address": "fa:16:3e:0d:a5:2b", "network": {"id": "b796a7f2-e983-443b-bcea-75bc6ccd43b4", "bridge": "br-int", "label": "tempest-network-smoke--53720538", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e90751a-bc", "ovs_interfaceid": "1e90751a-bc38-434a-a631-26ea57e33c68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:23 compute-0 nova_compute[182092]: 2026-01-23 09:43:23.828 182096 DEBUG oslo_concurrency.lockutils [req-3c904146-8a25-4c83-8857-79eb2c036baf req-fc7e0287-a277-4d4e-9d17-a937d278f50c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-16778216-8d6c-45fc-8e13-8025078ee2a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.030 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.045 182096 DEBUG nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-vif-unplugged-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.045 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.045 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.045 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] No waiting events found dispatching network-vif-unplugged-1e90751a-bc38-434a-a631-26ea57e33c68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-vif-unplugged-1e90751a-bc38-434a-a631-26ea57e33c68 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.046 182096 DEBUG oslo_concurrency.lockutils [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.047 182096 DEBUG nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] No waiting events found dispatching network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.047 182096 WARNING nova.compute.manager [req-ab381593-7f57-49f4-b887-eda117c2b068 req-35d23dd3-2374-4e16-9e2d-e221c4c397bb 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Received unexpected event network-vif-plugged-1e90751a-bc38-434a-a631-26ea57e33c68 for instance with vm_state active and task_state deleting.
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.051 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.051 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.052 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.052 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.079 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.080 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.081 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.082 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.130 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:24 compute-0 podman[232077]: 2026-01-23 09:43:24.175452245 +0000 UTC m=+0.066682769 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.180 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.181 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.228 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.461 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.462 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5468MB free_disk=73.18339538574219GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.462 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.463 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.583 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 1d01d877-250f-4069-b7a9-da76e21520a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.583 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 16778216-8d6c-45fc-8e13-8025078ee2a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.583 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.583 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.621 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.674 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.674 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.686 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.700 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.765 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.781 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.801 182096 DEBUG nova.network.neutron [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.807 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.807 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.824 182096 INFO nova.compute.manager [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Took 2.26 seconds to deallocate network for instance.
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.905 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.905 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:24 compute-0 nova_compute[182092]: 2026-01-23 09:43:24.991 182096 DEBUG nova.compute.provider_tree [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.008 182096 DEBUG nova.scheduler.client.report [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.029 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.051 182096 INFO nova.scheduler.client.report [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance 16778216-8d6c-45fc-8e13-8025078ee2a9
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.298 182096 DEBUG oslo_concurrency.lockutils [None req-7366b8d5-2603-490a-89b9-e673629a7fcb 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "16778216-8d6c-45fc-8e13-8025078ee2a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.405 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:25 compute-0 ovn_controller[94697]: 2026-01-23T09:43:25Z|00715|binding|INFO|Releasing lport c957aec4-5720-4b01-aab9-5f9131b765a9 from this chassis (sb_readonly=0)
Jan 23 09:43:25 compute-0 ovn_controller[94697]: 2026-01-23T09:43:25Z|00716|binding|INFO|Releasing lport f270e0ea-abb3-46b6-8ab7-f3b2d93dd703 from this chassis (sb_readonly=0)
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.559 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:25 compute-0 nova_compute[182092]: 2026-01-23 09:43:25.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:43:26 compute-0 nova_compute[182092]: 2026-01-23 09:43:26.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.473 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.582 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.911 182096 DEBUG nova.compute.manager [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.912 182096 DEBUG nova.compute.manager [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing instance network info cache due to event network-changed-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.912 182096 DEBUG oslo_concurrency.lockutils [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.912 182096 DEBUG oslo_concurrency.lockutils [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:43:27 compute-0 nova_compute[182092]: 2026-01-23 09:43:27.912 182096 DEBUG nova.network.neutron [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Refreshing network info cache for port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.041 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.042 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.042 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.042 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.042 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.050 182096 INFO nova.compute.manager [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Terminating instance
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.057 182096 DEBUG nova.compute.manager [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:43:28 compute-0 kernel: tap8f10cfa2-5b (unregistering): left promiscuous mode
Jan 23 09:43:28 compute-0 NetworkManager[54920]: <info>  [1769161408.0824] device (tap8f10cfa2-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00717|binding|INFO|Releasing lport 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d from this chassis (sb_readonly=0)
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00718|binding|INFO|Setting lport 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d down in Southbound
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.089 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00719|binding|INFO|Removing iface tap8f10cfa2-5b ovn-installed in OVS
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.091 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.095 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:7d:6f 10.100.0.11'], port_security=['fa:16:3e:10:7d:6f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64bd9a93-c91d-4ee4-acef-a479ec6c08af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a85a5b-705d-4753-aa41-a6530a944e47, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.096 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d in datapath 82773ff7-8c70-4ede-a3ab-2917fd9eda62 unbound from our chassis
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.097 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82773ff7-8c70-4ede-a3ab-2917fd9eda62, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.098 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[053adf38-49fd-4965-accc-45076eb2e869]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.099 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62 namespace which is not needed anymore
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.104 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 kernel: tap1a75b58f-84 (unregistering): left promiscuous mode
Jan 23 09:43:28 compute-0 NetworkManager[54920]: <info>  [1769161408.1118] device (tap1a75b58f-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.118 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00720|binding|INFO|Releasing lport 1a75b58f-843f-4939-99e9-00991ea0a602 from this chassis (sb_readonly=0)
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00721|binding|INFO|Setting lport 1a75b58f-843f-4939-99e9-00991ea0a602 down in Southbound
Jan 23 09:43:28 compute-0 ovn_controller[94697]: 2026-01-23T09:43:28Z|00722|binding|INFO|Removing iface tap1a75b58f-84 ovn-installed in OVS
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.121 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.134 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:30:e1 2001:db8:0:1:f816:3eff:fe60:30e1 2001:db8::f816:3eff:fe60:30e1'], port_security=['fa:16:3e:60:30:e1 2001:db8:0:1:f816:3eff:fe60:30e1 2001:db8::f816:3eff:fe60:30e1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe60:30e1/64 2001:db8::f816:3eff:fe60:30e1/64', 'neutron:device_id': '1d01d877-250f-4069-b7a9-da76e21520a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2482367c-d492-48cd-8a65-fd27ef9491ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64bd9a93-c91d-4ee4-acef-a479ec6c08af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71b900ed-163d-4540-b633-6108e740ce75, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=1a75b58f-843f-4939-99e9-00991ea0a602) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.143 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Jan 23 09:43:28 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a9.scope: Consumed 12.348s CPU time.
Jan 23 09:43:28 compute-0 systemd-machined[153562]: Machine qemu-82-instance-000000a9 terminated.
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [NOTICE]   (231455) : haproxy version is 2.8.14-c23fe91
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [NOTICE]   (231455) : path to executable is /usr/sbin/haproxy
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [WARNING]  (231455) : Exiting Master process...
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [WARNING]  (231455) : Exiting Master process...
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [ALERT]    (231455) : Current worker (231457) exited with code 143 (Terminated)
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62[231451]: [WARNING]  (231455) : All workers exited. Exiting... (0)
Jan 23 09:43:28 compute-0 systemd[1]: libpod-36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9.scope: Deactivated successfully.
Jan 23 09:43:28 compute-0 podman[232132]: 2026-01-23 09:43:28.212122927 +0000 UTC m=+0.037142226 container died 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:43:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-3dbdcd4fcf4f51378e65222d3f2f39d7d98614b6db343a76f665d04a22dfc106-merged.mount: Deactivated successfully.
Jan 23 09:43:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9-userdata-shm.mount: Deactivated successfully.
Jan 23 09:43:28 compute-0 podman[232132]: 2026-01-23 09:43:28.232844499 +0000 UTC m=+0.057863810 container cleanup 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:43:28 compute-0 systemd[1]: libpod-conmon-36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9.scope: Deactivated successfully.
Jan 23 09:43:28 compute-0 NetworkManager[54920]: <info>  [1769161408.2730] manager: (tap8f10cfa2-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Jan 23 09:43:28 compute-0 podman[232156]: 2026-01-23 09:43:28.281714409 +0000 UTC m=+0.031823818 container remove 36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.287 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8f051852-59c2-44e6-b7b8-c7fb9bd1a1cf]: (4, ('Fri Jan 23 09:43:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62 (36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9)\n36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9\nFri Jan 23 09:43:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62 (36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9)\n36a0d27bafaf992c68353bc9d5fdcd2412aba6df09b6d71f879ce3bb5f18bcf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.289 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d0f6e0-158c-4909-b43e-b099e270d800]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.290 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82773ff7-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.291 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.305 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 kernel: tap82773ff7-80: left promiscuous mode
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.313 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.315 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1d1220-89b4-41b1-bf0d-be6faf725493]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.317 182096 INFO nova.virt.libvirt.driver [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Instance destroyed successfully.
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.317 182096 DEBUG nova.objects.instance [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid 1d01d877-250f-4069-b7a9-da76e21520a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.324 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ceda714d-3367-4f7e-9456-eee9986dae0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.325 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3751001b-079f-47ac-9b12-1bbc5a961088]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.327 182096 DEBUG nova.virt.libvirt.vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:42:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:42:28Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.328 182096 DEBUG nova.network.os_vif_util [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.328 182096 DEBUG nova.network.os_vif_util [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.329 182096 DEBUG os_vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.330 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.330 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f10cfa2-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.331 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.333 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.336 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.338 182096 INFO os_vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:7d:6f,bridge_name='br-int',has_traffic_filtering=True,id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d,network=Network(82773ff7-8c70-4ede-a3ab-2917fd9eda62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f10cfa2-5b')
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.338 182096 DEBUG nova.virt.libvirt.vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1786233331',display_name='tempest-TestGettingAddress-server-1786233331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1786233331',id=169,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMxE7lMHOU5E+RbTg/U5SuOKDukhLcobVToS7AocbDXsPH/d6crJkExTAz3YaoZQmlXvcbkDDEgGuBAAgk+Mb1XMNU63YMD3ZP1PRDMF6g9lQQJRFK5O7+2AajE1CrmCcw==',key_name='tempest-TestGettingAddress-1872347893',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:42:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-m076fkhs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:42:28Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=1d01d877-250f-4069-b7a9-da76e21520a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.339 182096 DEBUG nova.network.os_vif_util [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.339 182096 DEBUG nova.network.os_vif_util [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.339 182096 DEBUG os_vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.341 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c991ab5e-db9b-47e2-afda-33e882001e4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485127, 'reachable_time': 18941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232202, 'error': None, 'target': 'ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d82773ff7\x2d8c70\x2d4ede\x2da3ab\x2d2917fd9eda62.mount: Deactivated successfully.
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.343 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.344 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a75b58f-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.343 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82773ff7-8c70-4ede-a3ab-2917fd9eda62 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.344 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5d170e-f222-41ea-a59a-94d988ddd84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.345 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.345 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 1a75b58f-843f-4939-99e9-00991ea0a602 in datapath 2482367c-d492-48cd-8a65-fd27ef9491ab unbound from our chassis
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.346 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2482367c-d492-48cd-8a65-fd27ef9491ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.348 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5ed3cc-fb09-49b2-aee5-9deb59ebe4b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.348 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab namespace which is not needed anymore
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.348 182096 INFO os_vif [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:30:e1,bridge_name='br-int',has_traffic_filtering=True,id=1a75b58f-843f-4939-99e9-00991ea0a602,network=Network(2482367c-d492-48cd-8a65-fd27ef9491ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a75b58f-84')
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.349 182096 INFO nova.virt.libvirt.driver [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Deleting instance files /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5_del
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.349 182096 INFO nova.virt.libvirt.driver [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Deletion of /var/lib/nova/instances/1d01d877-250f-4069-b7a9-da76e21520a5_del complete
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.405 182096 INFO nova.compute.manager [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.406 182096 DEBUG oslo.service.loopingcall [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.407 182096 DEBUG nova.compute.manager [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.407 182096 DEBUG nova.network.neutron [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [NOTICE]   (231515) : haproxy version is 2.8.14-c23fe91
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [NOTICE]   (231515) : path to executable is /usr/sbin/haproxy
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [WARNING]  (231515) : Exiting Master process...
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [ALERT]    (231515) : Current worker (231517) exited with code 143 (Terminated)
Jan 23 09:43:28 compute-0 neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab[231511]: [WARNING]  (231515) : All workers exited. Exiting... (0)
Jan 23 09:43:28 compute-0 systemd[1]: libpod-fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52.scope: Deactivated successfully.
Jan 23 09:43:28 compute-0 podman[232217]: 2026-01-23 09:43:28.448351284 +0000 UTC m=+0.034019559 container died fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:43:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52-userdata-shm.mount: Deactivated successfully.
Jan 23 09:43:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b44e5b8173979ace5bf6e492073611e9c266bd8b66e1c5a043dc8376dee5943-merged.mount: Deactivated successfully.
Jan 23 09:43:28 compute-0 podman[232217]: 2026-01-23 09:43:28.469749753 +0000 UTC m=+0.055418029 container cleanup fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:43:28 compute-0 systemd[1]: libpod-conmon-fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52.scope: Deactivated successfully.
Jan 23 09:43:28 compute-0 podman[232241]: 2026-01-23 09:43:28.509733628 +0000 UTC m=+0.024915202 container remove fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.513 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[108e0ef0-4acc-4f29-8f2b-d197e049a06a]: (4, ('Fri Jan 23 09:43:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab (fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52)\nfef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52\nFri Jan 23 09:43:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab (fef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52)\nfef2a65829ac2869d0d6e7c5b6826016971b083eaf4d102a481627b832fc7f52\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.514 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[92e1630e-3c1a-4fc4-896e-46905c4bd9b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.515 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2482367c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.516 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 kernel: tap2482367c-d0: left promiscuous mode
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.519 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.521 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[915cb828-da8b-48e6-8118-c60218c1a2e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 nova_compute[182092]: 2026-01-23 09:43:28.530 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.539 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a65d314d-54b6-468f-a37e-b267256e02ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.539 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0e29ed44-e40d-4de4-8a08-78570191f1a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.552 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d13853-4bd8-4b03-8706-5292eecd8989]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485189, 'reachable_time': 26792, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232253, 'error': None, 'target': 'ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.553 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2482367c-d492-48cd-8a65-fd27ef9491ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:43:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:28.553 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[972f21e5-bb64-421b-b9ef-34dc33274fa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:43:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d2482367c\x2dd492\x2d48cd\x2d8a65\x2dfd27ef9491ab.mount: Deactivated successfully.
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.122 182096 DEBUG nova.compute.manager [req-21233f30-353c-4a8e-8d40-ae726be7673e req-367651df-80d1-4e72-a782-3ca235383d43 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-deleted-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.123 182096 INFO nova.compute.manager [req-21233f30-353c-4a8e-8d40-ae726be7673e req-367651df-80d1-4e72-a782-3ca235383d43 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Neutron deleted interface 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d; detaching it from the instance and deleting it from the info cache
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.123 182096 DEBUG nova.network.neutron [req-21233f30-353c-4a8e-8d40-ae726be7673e req-367651df-80d1-4e72-a782-3ca235383d43 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.150 182096 DEBUG nova.compute.manager [req-21233f30-353c-4a8e-8d40-ae726be7673e req-367651df-80d1-4e72-a782-3ca235383d43 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Detach interface failed, port_id=8f10cfa2-5bcf-4ed2-b3ba-90adc891327d, reason: Instance 1d01d877-250f-4069-b7a9-da76e21520a5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.790 182096 DEBUG nova.network.neutron [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updated VIF entry in instance network info cache for port 8f10cfa2-5bcf-4ed2-b3ba-90adc891327d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.791 182096 DEBUG nova.network.neutron [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [{"id": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "address": "fa:16:3e:10:7d:6f", "network": {"id": "82773ff7-8c70-4ede-a3ab-2917fd9eda62", "bridge": "br-int", "label": "tempest-network-smoke--309715042", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f10cfa2-5b", "ovs_interfaceid": "8f10cfa2-5bcf-4ed2-b3ba-90adc891327d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1a75b58f-843f-4939-99e9-00991ea0a602", "address": "fa:16:3e:60:30:e1", "network": {"id": "2482367c-d492-48cd-8a65-fd27ef9491ab", "bridge": "br-int", "label": "tempest-network-smoke--203438059", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe60:30e1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a75b58f-84", "ovs_interfaceid": "1a75b58f-843f-4939-99e9-00991ea0a602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:29 compute-0 nova_compute[182092]: 2026-01-23 09:43:29.809 182096 DEBUG oslo_concurrency.lockutils [req-31cfb6e0-bf76-4804-92c6-3c6cacf664f9 req-29db7b26-2dd7-4afe-bc5b-a652d1ee660c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-1d01d877-250f-4069-b7a9-da76e21520a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.004 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-unplugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.004 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.005 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.005 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.005 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No waiting events found dispatching network-vif-unplugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.005 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-unplugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.006 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.006 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.006 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.006 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.007 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No waiting events found dispatching network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.007 182096 WARNING nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received unexpected event network-vif-plugged-8f10cfa2-5bcf-4ed2-b3ba-90adc891327d for instance with vm_state active and task_state deleting.
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.007 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-unplugged-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.007 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.008 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.008 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.008 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No waiting events found dispatching network-vif-unplugged-1a75b58f-843f-4939-99e9-00991ea0a602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.008 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-unplugged-1a75b58f-843f-4939-99e9-00991ea0a602 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.009 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.009 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.009 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.009 182096 DEBUG oslo_concurrency.lockutils [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.010 182096 DEBUG nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] No waiting events found dispatching network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.010 182096 WARNING nova.compute.manager [req-954a8e04-9d38-4b0f-b4f3-536ffaa93d49 req-6411e1aa-f72c-4df5-b99b-519540339e66 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received unexpected event network-vif-plugged-1a75b58f-843f-4939-99e9-00991ea0a602 for instance with vm_state active and task_state deleting.
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.156 182096 DEBUG nova.network.neutron [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.167 182096 INFO nova.compute.manager [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Took 1.76 seconds to deallocate network for instance.
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.223 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.223 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.273 182096 DEBUG nova.compute.provider_tree [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.285 182096 DEBUG nova.scheduler.client.report [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.299 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.316 182096 INFO nova.scheduler.client.report [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance 1d01d877-250f-4069-b7a9-da76e21520a5
Jan 23 09:43:30 compute-0 nova_compute[182092]: 2026-01-23 09:43:30.364 182096 DEBUG oslo_concurrency.lockutils [None req-f74b5c18-231e-4a26-8181-ff6433d4136e 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "1d01d877-250f-4069-b7a9-da76e21520a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:31 compute-0 nova_compute[182092]: 2026-01-23 09:43:31.198 182096 DEBUG nova.compute.manager [req-57efc596-199a-4909-9ae3-ca9a8e8a660e req-2d8fd810-7bab-48d2-a0f2-254cf663a566 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Received event network-vif-deleted-1a75b58f-843f-4939-99e9-00991ea0a602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:43:31 compute-0 podman[232255]: 2026-01-23 09:43:31.222258678 +0000 UTC m=+0.050246857 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:43:31 compute-0 podman[232254]: 2026-01-23 09:43:31.227983323 +0000 UTC m=+0.058928468 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:43:32 compute-0 nova_compute[182092]: 2026-01-23 09:43:32.586 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:33 compute-0 nova_compute[182092]: 2026-01-23 09:43:33.346 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:35 compute-0 nova_compute[182092]: 2026-01-23 09:43:35.660 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:43:35 compute-0 nova_compute[182092]: 2026-01-23 09:43:35.661 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:43:35 compute-0 nova_compute[182092]: 2026-01-23 09:43:35.676 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:43:36 compute-0 nova_compute[182092]: 2026-01-23 09:43:36.810 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:36 compute-0 nova_compute[182092]: 2026-01-23 09:43:36.950 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:37 compute-0 nova_compute[182092]: 2026-01-23 09:43:37.454 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161402.4536843, 16778216-8d6c-45fc-8e13-8025078ee2a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:37 compute-0 nova_compute[182092]: 2026-01-23 09:43:37.454 182096 INFO nova.compute.manager [-] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] VM Stopped (Lifecycle Event)
Jan 23 09:43:37 compute-0 nova_compute[182092]: 2026-01-23 09:43:37.477 182096 DEBUG nova.compute.manager [None req-65a58370-e840-449d-8ede-af55cf2eb679 - - - - - -] [instance: 16778216-8d6c-45fc-8e13-8025078ee2a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:37 compute-0 nova_compute[182092]: 2026-01-23 09:43:37.587 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:38 compute-0 nova_compute[182092]: 2026-01-23 09:43:38.348 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:39.873 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:43:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:39.874 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:43:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:39.874 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:43:41 compute-0 podman[232293]: 2026-01-23 09:43:41.201164193 +0000 UTC m=+0.038371786 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:43:41 compute-0 podman[232294]: 2026-01-23 09:43:41.208054084 +0000 UTC m=+0.043564937 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:43:42 compute-0 nova_compute[182092]: 2026-01-23 09:43:42.589 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:43 compute-0 nova_compute[182092]: 2026-01-23 09:43:43.316 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161408.3158414, 1d01d877-250f-4069-b7a9-da76e21520a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:43:43 compute-0 nova_compute[182092]: 2026-01-23 09:43:43.317 182096 INFO nova.compute.manager [-] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] VM Stopped (Lifecycle Event)
Jan 23 09:43:43 compute-0 nova_compute[182092]: 2026-01-23 09:43:43.334 182096 DEBUG nova.compute.manager [None req-647633aa-82c5-4fd8-bfb7-b89f9c831be4 - - - - - -] [instance: 1d01d877-250f-4069-b7a9-da76e21520a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:43:43 compute-0 nova_compute[182092]: 2026-01-23 09:43:43.349 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:43 compute-0 nova_compute[182092]: 2026-01-23 09:43:43.906 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:43.906 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:43:43 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:43.906 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:43:44 compute-0 podman[232330]: 2026-01-23 09:43:44.200215886 +0000 UTC m=+0.038808872 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Jan 23 09:43:47 compute-0 nova_compute[182092]: 2026-01-23 09:43:47.589 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:48 compute-0 nova_compute[182092]: 2026-01-23 09:43:48.351 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:52 compute-0 nova_compute[182092]: 2026-01-23 09:43:52.590 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:53 compute-0 nova_compute[182092]: 2026-01-23 09:43:53.353 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:43:53.908 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:43:55 compute-0 podman[232348]: 2026-01-23 09:43:55.218181467 +0000 UTC m=+0.057347747 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:43:57 compute-0 nova_compute[182092]: 2026-01-23 09:43:57.591 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:43:58 compute-0 nova_compute[182092]: 2026-01-23 09:43:58.355 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.521 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.521 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.534 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.605 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.606 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.610 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.610 182096 INFO nova.compute.claims [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.690 182096 DEBUG nova.compute.provider_tree [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.701 182096 DEBUG nova.scheduler.client.report [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.718 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.718 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.762 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.762 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.778 182096 INFO nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.792 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.886 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.886 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.887 182096 INFO nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Creating image(s)
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.887 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.887 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.888 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.898 182096 DEBUG nova.policy [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.900 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.946 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.946 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.947 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:00 compute-0 nova_compute[182092]: 2026-01-23 09:44:00.956 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.002 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.003 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.026 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.026 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.027 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.072 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.073 182096 DEBUG nova.virt.disk.api [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.073 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.116 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.117 182096 DEBUG nova.virt.disk.api [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.118 182096 DEBUG nova.objects.instance [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid d09a7260-4b49-49b1-af38-89443d01e530 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.131 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.131 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Ensure instance console log exists: /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.131 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.132 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:01 compute-0 nova_compute[182092]: 2026-01-23 09:44:01.132 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:02 compute-0 podman[232389]: 2026-01-23 09:44:02.206527079 +0000 UTC m=+0.037593248 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:44:02 compute-0 podman[232388]: 2026-01-23 09:44:02.212241855 +0000 UTC m=+0.044826537 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:44:02 compute-0 nova_compute[182092]: 2026-01-23 09:44:02.592 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:02 compute-0 nova_compute[182092]: 2026-01-23 09:44:02.869 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Successfully created port: 2a379fd2-6ab8-422f-a21a-a30471e8e815 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.357 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.399 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Successfully updated port: 2a379fd2-6ab8-422f-a21a-a30471e8e815 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.410 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.411 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.411 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.474 182096 DEBUG nova.compute.manager [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.474 182096 DEBUG nova.compute.manager [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing instance network info cache due to event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.474 182096 DEBUG oslo_concurrency.lockutils [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:03 compute-0 nova_compute[182092]: 2026-01-23 09:44:03.541 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.098 182096 DEBUG nova.network.neutron [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updating instance_info_cache with network_info: [{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.114 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.114 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Instance network_info: |[{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.114 182096 DEBUG oslo_concurrency.lockutils [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.114 182096 DEBUG nova.network.neutron [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.116 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Start _get_guest_xml network_info=[{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.119 182096 WARNING nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.127 182096 DEBUG nova.virt.libvirt.host [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.127 182096 DEBUG nova.virt.libvirt.host [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.130 182096 DEBUG nova.virt.libvirt.host [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.130 182096 DEBUG nova.virt.libvirt.host [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.131 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.131 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.132 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.132 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.132 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.132 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.132 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.133 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.133 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.133 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.133 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.133 182096 DEBUG nova.virt.hardware [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.136 182096 DEBUG nova.virt.libvirt.vif [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-526420849',display_name='tempest-TestNetworkBasicOps-server-526420849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-526420849',id=175,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHxJ5SlOQu/cul4tOUHtgxIEIKU2C/4h0tnfAYLBFbPk04+A3gKim7moLWYEeH1OOboFWp6edIyPCAu6RCxa71HOUt/Q3c8UiacVet6PQXx/cUk76QffqLiVpkh4JgAUwA==',key_name='tempest-TestNetworkBasicOps-695768523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-bc354ly3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:00Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d09a7260-4b49-49b1-af38-89443d01e530,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.136 182096 DEBUG nova.network.os_vif_util [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.137 182096 DEBUG nova.network.os_vif_util [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.138 182096 DEBUG nova.objects.instance [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid d09a7260-4b49-49b1-af38-89443d01e530 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.148 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <uuid>d09a7260-4b49-49b1-af38-89443d01e530</uuid>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <name>instance-000000af</name>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-526420849</nova:name>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:44:04</nova:creationTime>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         <nova:port uuid="2a379fd2-6ab8-422f-a21a-a30471e8e815">
Jan 23 09:44:04 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <system>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="serial">d09a7260-4b49-49b1-af38-89443d01e530</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="uuid">d09a7260-4b49-49b1-af38-89443d01e530</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </system>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <os>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </os>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <features>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </features>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.config"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:96:c4:4d"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <target dev="tap2a379fd2-6a"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/console.log" append="off"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <video>
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </video>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:44:04 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:44:04 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:44:04 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:44:04 compute-0 nova_compute[182092]: </domain>
Jan 23 09:44:04 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.149 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Preparing to wait for external event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.149 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.149 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.150 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.150 182096 DEBUG nova.virt.libvirt.vif [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-526420849',display_name='tempest-TestNetworkBasicOps-server-526420849',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-526420849',id=175,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHxJ5SlOQu/cul4tOUHtgxIEIKU2C/4h0tnfAYLBFbPk04+A3gKim7moLWYEeH1OOboFWp6edIyPCAu6RCxa71HOUt/Q3c8UiacVet6PQXx/cUk76QffqLiVpkh4JgAUwA==',key_name='tempest-TestNetworkBasicOps-695768523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-bc354ly3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:00Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d09a7260-4b49-49b1-af38-89443d01e530,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.150 182096 DEBUG nova.network.os_vif_util [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.151 182096 DEBUG nova.network.os_vif_util [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.151 182096 DEBUG os_vif [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.151 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.152 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.152 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.154 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.154 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a379fd2-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.154 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a379fd2-6a, col_values=(('external_ids', {'iface-id': '2a379fd2-6ab8-422f-a21a-a30471e8e815', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:c4:4d', 'vm-uuid': 'd09a7260-4b49-49b1-af38-89443d01e530'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.155 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:04 compute-0 NetworkManager[54920]: <info>  [1769161444.1563] manager: (tap2a379fd2-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.158 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.161 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.161 182096 INFO os_vif [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a')
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.192 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.193 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.193 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:96:c4:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.193 182096 INFO nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Using config drive
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.855 182096 INFO nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Creating config drive at /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.config
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.859 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0aqsqle0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:04 compute-0 nova_compute[182092]: 2026-01-23 09:44:04.976 182096 DEBUG oslo_concurrency.processutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0aqsqle0" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:05 compute-0 kernel: tap2a379fd2-6a: entered promiscuous mode
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.0121] manager: (tap2a379fd2-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/350)
Jan 23 09:44:05 compute-0 ovn_controller[94697]: 2026-01-23T09:44:05Z|00723|binding|INFO|Claiming lport 2a379fd2-6ab8-422f-a21a-a30471e8e815 for this chassis.
Jan 23 09:44:05 compute-0 ovn_controller[94697]: 2026-01-23T09:44:05Z|00724|binding|INFO|2a379fd2-6ab8-422f-a21a-a30471e8e815: Claiming fa:16:3e:96:c4:4d 10.100.0.9
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.014 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.019 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.029 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:c4:4d 10.100.0.9'], port_security=['fa:16:3e:96:c4:4d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd09a7260-4b49-49b1-af38-89443d01e530', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9f1d386-1dad-48ce-b879-946e975f50a1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=432252ba-29ea-4365-9f6a-2fb348dad102, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a379fd2-6ab8-422f-a21a-a30471e8e815) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.030 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a379fd2-6ab8-422f-a21a-a30471e8e815 in datapath b5f8e399-cddf-49e5-a1c4-0f7ba6831cad bound to our chassis
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.031 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5f8e399-cddf-49e5-a1c4-0f7ba6831cad
Jan 23 09:44:05 compute-0 systemd-udevd[232447]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.042 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea16dba-f20a-4075-9eb0-b23a1963607c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.042 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5f8e399-c1 in ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.044 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5f8e399-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.044 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[aa784310-f321-48b2-9ce5-9b2194713e4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.044 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e43693-1295-494e-9f3b-b0005865b2da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 systemd-machined[153562]: New machine qemu-84-instance-000000af.
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.0522] device (tap2a379fd2-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.052 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[181fd3b1-e9e6-4c8d-a0c2-6584ce6f36de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.0532] device (tap2a379fd2-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:44:05 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-000000af.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.076 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.075 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[38ddd35f-9e0a-4724-88e0-069084c933e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.083 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_controller[94697]: 2026-01-23T09:44:05Z|00725|binding|INFO|Setting lport 2a379fd2-6ab8-422f-a21a-a30471e8e815 ovn-installed in OVS
Jan 23 09:44:05 compute-0 ovn_controller[94697]: 2026-01-23T09:44:05Z|00726|binding|INFO|Setting lport 2a379fd2-6ab8-422f-a21a-a30471e8e815 up in Southbound
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.087 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.101 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[5bed26b5-920b-44ed-a3a0-1e7b267201b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.1046] manager: (tapb5f8e399-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/351)
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.105 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[37619f7f-0f7e-4193-8762-999e5b8e55aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.136 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[dc72b468-3ea2-43c0-8f22-3fa03f2b6c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.140 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e266b500-387d-4e78-b073-4769389d2a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.1590] device (tapb5f8e399-c0): carrier: link connected
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.163 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f75190-e3bc-4cba-a92b-4c89e79debcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.177 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[66d800b8-bd9f-4228-9cbc-7268858f24a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5f8e399-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:15:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495064, 'reachable_time': 16085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232471, 'error': None, 'target': 'ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.190 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6284c156-5077-467b-8ab1-0cbfc1991ec0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7d:15ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495064, 'tstamp': 495064}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232472, 'error': None, 'target': 'ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.202 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[628f562b-e0d8-4ebd-ad72-d012567801b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5f8e399-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:15:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495064, 'reachable_time': 16085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232473, 'error': None, 'target': 'ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.225 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a552b35a-3986-4588-9c91-aad9918bf336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.265 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b38468e-fd12-47fb-8d19-07435811fceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.266 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5f8e399-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.266 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.267 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5f8e399-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:05 compute-0 kernel: tapb5f8e399-c0: entered promiscuous mode
Jan 23 09:44:05 compute-0 NetworkManager[54920]: <info>  [1769161445.2685] manager: (tapb5f8e399-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.270 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5f8e399-c0, col_values=(('external_ids', {'iface-id': 'af5cd699-3229-480b-a2f1-358eeaa30e2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:05 compute-0 ovn_controller[94697]: 2026-01-23T09:44:05Z|00727|binding|INFO|Releasing lport af5cd699-3229-480b-a2f1-358eeaa30e2d from this chassis (sb_readonly=0)
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.268 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.286 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5f8e399-cddf-49e5-a1c4-0f7ba6831cad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5f8e399-cddf-49e5-a1c4-0f7ba6831cad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.286 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.287 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c35c72e6-5388-4356-88e5-eea7ad1840f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.287 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/b5f8e399-cddf-49e5-a1c4-0f7ba6831cad.pid.haproxy
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID b5f8e399-cddf-49e5-a1c4-0f7ba6831cad
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:44:05 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:05.288 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'env', 'PROCESS_TAG=haproxy-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5f8e399-cddf-49e5-a1c4-0f7ba6831cad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.536 182096 DEBUG nova.compute.manager [req-dc54e8c6-c2b3-45e1-820e-3a216d261580 req-f42308e8-c6ba-4bb6-ac18-de2ff47c2934 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.537 182096 DEBUG oslo_concurrency.lockutils [req-dc54e8c6-c2b3-45e1-820e-3a216d261580 req-f42308e8-c6ba-4bb6-ac18-de2ff47c2934 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.537 182096 DEBUG oslo_concurrency.lockutils [req-dc54e8c6-c2b3-45e1-820e-3a216d261580 req-f42308e8-c6ba-4bb6-ac18-de2ff47c2934 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.538 182096 DEBUG oslo_concurrency.lockutils [req-dc54e8c6-c2b3-45e1-820e-3a216d261580 req-f42308e8-c6ba-4bb6-ac18-de2ff47c2934 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.538 182096 DEBUG nova.compute.manager [req-dc54e8c6-c2b3-45e1-820e-3a216d261580 req-f42308e8-c6ba-4bb6-ac18-de2ff47c2934 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Processing event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:44:05 compute-0 podman[232501]: 2026-01-23 09:44:05.562391456 +0000 UTC m=+0.031905994 container create ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:44:05 compute-0 systemd[1]: Started libpod-conmon-ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06.scope.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.595 182096 DEBUG nova.network.neutron [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updated VIF entry in instance network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.595 182096 DEBUG nova.network.neutron [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updating instance_info_cache with network_info: [{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:05 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:44:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb81b739c8885b674ac987b826fc722db3ad7f1b32246baec037b688c71dc30d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.611 182096 DEBUG oslo_concurrency.lockutils [req-5160e04d-1181-4413-9961-4d103799a277 req-14cb5d45-2d95-44fe-b093-5daae3337a5b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:05 compute-0 podman[232501]: 2026-01-23 09:44:05.613386802 +0000 UTC m=+0.082901342 container init ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:44:05 compute-0 podman[232501]: 2026-01-23 09:44:05.620186544 +0000 UTC m=+0.089701083 container start ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:44:05 compute-0 podman[232501]: 2026-01-23 09:44:05.549282346 +0000 UTC m=+0.018796905 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:44:05 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [NOTICE]   (232523) : New worker (232526) forked
Jan 23 09:44:05 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [NOTICE]   (232523) : Loading success.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.656 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.658 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161445.6574938, d09a7260-4b49-49b1-af38-89443d01e530 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.658 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] VM Started (Lifecycle Event)
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.660 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.663 182096 INFO nova.virt.libvirt.driver [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Instance spawned successfully.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.663 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.683 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.687 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.687 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.687 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.687 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.688 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.688 182096 DEBUG nova.virt.libvirt.driver [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.691 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.722 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.723 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161445.6575704, d09a7260-4b49-49b1-af38-89443d01e530 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.723 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] VM Paused (Lifecycle Event)
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.739 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.742 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161445.659036, d09a7260-4b49-49b1-af38-89443d01e530 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.743 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] VM Resumed (Lifecycle Event)
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.757 182096 INFO nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Took 4.87 seconds to spawn the instance on the hypervisor.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.757 182096 DEBUG nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.762 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.764 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.784 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.824 182096 INFO nova.compute.manager [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Took 5.25 seconds to build instance.
Jan 23 09:44:05 compute-0 nova_compute[182092]: 2026-01-23 09:44:05.840 182096 DEBUG oslo_concurrency.lockutils [None req-72e257b1-e830-4778-b643-c269c6c69665 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.800 182096 DEBUG nova.compute.manager [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.800 182096 DEBUG oslo_concurrency.lockutils [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.801 182096 DEBUG oslo_concurrency.lockutils [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.801 182096 DEBUG oslo_concurrency.lockutils [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.801 182096 DEBUG nova.compute.manager [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] No waiting events found dispatching network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:44:07 compute-0 nova_compute[182092]: 2026-01-23 09:44:07.802 182096 WARNING nova.compute.manager [req-94248496-1c65-4aaa-8748-ec100f3af22f req-822a8ef5-4815-44a6-ad06-24df17d36d88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received unexpected event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 for instance with vm_state active and task_state None.
Jan 23 09:44:08 compute-0 NetworkManager[54920]: <info>  [1769161448.9014] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 23 09:44:08 compute-0 NetworkManager[54920]: <info>  [1769161448.9020] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 23 09:44:08 compute-0 nova_compute[182092]: 2026-01-23 09:44:08.901 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:08 compute-0 nova_compute[182092]: 2026-01-23 09:44:08.994 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:08 compute-0 ovn_controller[94697]: 2026-01-23T09:44:08Z|00728|binding|INFO|Releasing lport af5cd699-3229-480b-a2f1-358eeaa30e2d from this chassis (sb_readonly=0)
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.004 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.155 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.190 182096 DEBUG nova.compute.manager [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.190 182096 DEBUG nova.compute.manager [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing instance network info cache due to event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.190 182096 DEBUG oslo_concurrency.lockutils [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.191 182096 DEBUG oslo_concurrency.lockutils [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:09 compute-0 nova_compute[182092]: 2026-01-23 09:44:09.191 182096 DEBUG nova.network.neutron [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:44:10 compute-0 nova_compute[182092]: 2026-01-23 09:44:10.244 182096 DEBUG nova.network.neutron [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updated VIF entry in instance network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:44:10 compute-0 nova_compute[182092]: 2026-01-23 09:44:10.245 182096 DEBUG nova.network.neutron [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updating instance_info_cache with network_info: [{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:10 compute-0 nova_compute[182092]: 2026-01-23 09:44:10.258 182096 DEBUG oslo_concurrency.lockutils [req-fee018b4-f220-427c-87af-f4e8660b7263 req-94d69ef9-2bcd-44d2-a3c4-1bbf6201695c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:12 compute-0 podman[232533]: 2026-01-23 09:44:12.20737449 +0000 UTC m=+0.039444938 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:44:12 compute-0 podman[232532]: 2026-01-23 09:44:12.207495309 +0000 UTC m=+0.039849272 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:44:12 compute-0 nova_compute[182092]: 2026-01-23 09:44:12.594 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:14 compute-0 nova_compute[182092]: 2026-01-23 09:44:14.156 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:15 compute-0 podman[232569]: 2026-01-23 09:44:15.197169488 +0000 UTC m=+0.036469270 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Jan 23 09:44:17 compute-0 ovn_controller[94697]: 2026-01-23T09:44:17Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:96:c4:4d 10.100.0.9
Jan 23 09:44:17 compute-0 ovn_controller[94697]: 2026-01-23T09:44:17Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:c4:4d 10.100.0.9
Jan 23 09:44:17 compute-0 nova_compute[182092]: 2026-01-23 09:44:17.597 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:17 compute-0 nova_compute[182092]: 2026-01-23 09:44:17.661 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:18 compute-0 nova_compute[182092]: 2026-01-23 09:44:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:18 compute-0 nova_compute[182092]: 2026-01-23 09:44:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:18 compute-0 nova_compute[182092]: 2026-01-23 09:44:18.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:19 compute-0 nova_compute[182092]: 2026-01-23 09:44:19.160 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.669 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.669 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.669 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.686 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.686 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.687 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.687 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.723 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.770 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.771 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:21 compute-0 nova_compute[182092]: 2026-01-23 09:44:21.817 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.014 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.015 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5540MB free_disk=73.18334197998047GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.015 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.015 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.078 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance d09a7260-4b49-49b1-af38-89443d01e530 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.078 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.078 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.113 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.127 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.143 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.143 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:22 compute-0 nova_compute[182092]: 2026-01-23 09:44:22.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:23 compute-0 nova_compute[182092]: 2026-01-23 09:44:23.675 182096 INFO nova.compute.manager [None req-fa21a479-c412-485a-b5a4-66c53ce1c2ce 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Get console output
Jan 23 09:44:23 compute-0 nova_compute[182092]: 2026-01-23 09:44:23.679 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:44:24 compute-0 nova_compute[182092]: 2026-01-23 09:44:24.162 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:25 compute-0 nova_compute[182092]: 2026-01-23 09:44:25.122 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:25 compute-0 ovn_controller[94697]: 2026-01-23T09:44:25Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:c4:4d 10.100.0.9
Jan 23 09:44:26 compute-0 podman[232606]: 2026-01-23 09:44:26.220429284 +0000 UTC m=+0.054309839 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 09:44:26 compute-0 nova_compute[182092]: 2026-01-23 09:44:26.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:26 compute-0 nova_compute[182092]: 2026-01-23 09:44:26.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:44:27 compute-0 nova_compute[182092]: 2026-01-23 09:44:27.600 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:27 compute-0 ovn_controller[94697]: 2026-01-23T09:44:27Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:c4:4d 10.100.0.9
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.108 182096 DEBUG nova.compute.manager [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.108 182096 DEBUG nova.compute.manager [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing instance network info cache due to event network-changed-2a379fd2-6ab8-422f-a21a-a30471e8e815. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.108 182096 DEBUG oslo_concurrency.lockutils [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.109 182096 DEBUG oslo_concurrency.lockutils [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.109 182096 DEBUG nova.network.neutron [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Refreshing network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.135 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.135 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.136 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.160 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.161 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.161 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.161 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.161 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.168 182096 INFO nova.compute.manager [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Terminating instance
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.175 182096 DEBUG nova.compute.manager [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:44:28 compute-0 kernel: tap2a379fd2-6a (unregistering): left promiscuous mode
Jan 23 09:44:28 compute-0 NetworkManager[54920]: <info>  [1769161468.2036] device (tap2a379fd2-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.207 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 ovn_controller[94697]: 2026-01-23T09:44:28Z|00729|binding|INFO|Releasing lport 2a379fd2-6ab8-422f-a21a-a30471e8e815 from this chassis (sb_readonly=0)
Jan 23 09:44:28 compute-0 ovn_controller[94697]: 2026-01-23T09:44:28Z|00730|binding|INFO|Setting lport 2a379fd2-6ab8-422f-a21a-a30471e8e815 down in Southbound
Jan 23 09:44:28 compute-0 ovn_controller[94697]: 2026-01-23T09:44:28Z|00731|binding|INFO|Removing iface tap2a379fd2-6a ovn-installed in OVS
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.210 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.217 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:c4:4d 10.100.0.9'], port_security=['fa:16:3e:96:c4:4d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd09a7260-4b49-49b1-af38-89443d01e530', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9f1d386-1dad-48ce-b879-946e975f50a1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=432252ba-29ea-4365-9f6a-2fb348dad102, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=2a379fd2-6ab8-422f-a21a-a30471e8e815) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.218 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 2a379fd2-6ab8-422f-a21a-a30471e8e815 in datapath b5f8e399-cddf-49e5-a1c4-0f7ba6831cad unbound from our chassis
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.219 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.220 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[98eb0269-c280-4db6-b235-c63248e4cd58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.221 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad namespace which is not needed anymore
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.223 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 23 09:44:28 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Consumed 11.412s CPU time.
Jan 23 09:44:28 compute-0 systemd-machined[153562]: Machine qemu-84-instance-000000af terminated.
Jan 23 09:44:28 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [NOTICE]   (232523) : haproxy version is 2.8.14-c23fe91
Jan 23 09:44:28 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [NOTICE]   (232523) : path to executable is /usr/sbin/haproxy
Jan 23 09:44:28 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [WARNING]  (232523) : Exiting Master process...
Jan 23 09:44:28 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [ALERT]    (232523) : Current worker (232526) exited with code 143 (Terminated)
Jan 23 09:44:28 compute-0 neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad[232515]: [WARNING]  (232523) : All workers exited. Exiting... (0)
Jan 23 09:44:28 compute-0 systemd[1]: libpod-ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06.scope: Deactivated successfully.
Jan 23 09:44:28 compute-0 podman[232649]: 2026-01-23 09:44:28.313739009 +0000 UTC m=+0.033505700 container died ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:44:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06-userdata-shm.mount: Deactivated successfully.
Jan 23 09:44:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb81b739c8885b674ac987b826fc722db3ad7f1b32246baec037b688c71dc30d-merged.mount: Deactivated successfully.
Jan 23 09:44:28 compute-0 podman[232649]: 2026-01-23 09:44:28.33321878 +0000 UTC m=+0.052985471 container cleanup ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:44:28 compute-0 systemd[1]: libpod-conmon-ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06.scope: Deactivated successfully.
Jan 23 09:44:28 compute-0 podman[232672]: 2026-01-23 09:44:28.374938728 +0000 UTC m=+0.025768259 container remove ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.378 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[08f1b6bf-e6f9-48ae-ab18-ce2f051163ca]: (4, ('Fri Jan 23 09:44:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad (ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06)\nffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06\nFri Jan 23 09:44:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad (ffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06)\nffd18bf4f0dde7eba3bcd1b89cd8fcb66147fd39d909c38f6161a0f0dc96de06\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.381 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d872dca3-34c9-486c-ad27-c99e49bddb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.382 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5f8e399-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:28 compute-0 kernel: tapb5f8e399-c0: left promiscuous mode
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.386 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 NetworkManager[54920]: <info>  [1769161468.3884] manager: (tap2a379fd2-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.401 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.403 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[42322aad-27dd-4ed7-bb67-1d36beaa49a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.412 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f991058f-0e80-471f-b518-ab95ac044100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.413 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[69c9e772-da7d-4f40-95fb-b9545896d509]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.425 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[54f198fe-0335-48a6-b229-dea9821a067f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495058, 'reachable_time': 15191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232700, 'error': None, 'target': 'ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.425 182096 INFO nova.virt.libvirt.driver [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Instance destroyed successfully.
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.425 182096 DEBUG nova.objects.instance [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid d09a7260-4b49-49b1-af38-89443d01e530 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.427 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5f8e399-cddf-49e5-a1c4-0f7ba6831cad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:44:28 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:28.427 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[0e890993-5ec4-480f-bd0d-966f27649695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:28 compute-0 systemd[1]: run-netns-ovnmeta\x2db5f8e399\x2dcddf\x2d49e5\x2da1c4\x2d0f7ba6831cad.mount: Deactivated successfully.
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.436 182096 DEBUG nova.virt.libvirt.vif [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:43:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-526420849',display_name='tempest-TestNetworkBasicOps-server-526420849',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-526420849',id=175,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHxJ5SlOQu/cul4tOUHtgxIEIKU2C/4h0tnfAYLBFbPk04+A3gKim7moLWYEeH1OOboFWp6edIyPCAu6RCxa71HOUt/Q3c8UiacVet6PQXx/cUk76QffqLiVpkh4JgAUwA==',key_name='tempest-TestNetworkBasicOps-695768523',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:44:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-bc354ly3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:44:05Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=d09a7260-4b49-49b1-af38-89443d01e530,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.436 182096 DEBUG nova.network.os_vif_util [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.437 182096 DEBUG nova.network.os_vif_util [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.437 182096 DEBUG os_vif [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.439 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.439 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a379fd2-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.444 182096 INFO os_vif [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:c4:4d,bridge_name='br-int',has_traffic_filtering=True,id=2a379fd2-6ab8-422f-a21a-a30471e8e815,network=Network(b5f8e399-cddf-49e5-a1c4-0f7ba6831cad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a379fd2-6a')
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.444 182096 INFO nova.virt.libvirt.driver [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Deleting instance files /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530_del
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.444 182096 INFO nova.virt.libvirt.driver [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Deletion of /var/lib/nova/instances/d09a7260-4b49-49b1-af38-89443d01e530_del complete
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.498 182096 INFO nova.compute.manager [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.499 182096 DEBUG oslo.service.loopingcall [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.499 182096 DEBUG nova.compute.manager [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:44:28 compute-0 nova_compute[182092]: 2026-01-23 09:44:28.499 182096 DEBUG nova.network.neutron [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.258 182096 DEBUG nova.network.neutron [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.271 182096 INFO nova.compute.manager [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Took 0.77 seconds to deallocate network for instance.
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.315 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.315 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.330 182096 DEBUG nova.compute.manager [req-12084a7d-ff4f-4568-9383-e8725f184622 req-36626593-e757-4c94-b635-22efd315f50e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-vif-deleted-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.351 182096 DEBUG nova.compute.provider_tree [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.359 182096 DEBUG nova.scheduler.client.report [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.372 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.388 182096 INFO nova.scheduler.client.report [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance d09a7260-4b49-49b1-af38-89443d01e530
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.435 182096 DEBUG oslo_concurrency.lockutils [None req-7b3a793f-24c0-4d22-8fbc-3e02ab6fb303 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.444 182096 DEBUG nova.network.neutron [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updated VIF entry in instance network info cache for port 2a379fd2-6ab8-422f-a21a-a30471e8e815. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.444 182096 DEBUG nova.network.neutron [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Updating instance_info_cache with network_info: [{"id": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "address": "fa:16:3e:96:c4:4d", "network": {"id": "b5f8e399-cddf-49e5-a1c4-0f7ba6831cad", "bridge": "br-int", "label": "tempest-network-smoke--2079083846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a379fd2-6a", "ovs_interfaceid": "2a379fd2-6ab8-422f-a21a-a30471e8e815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:29 compute-0 nova_compute[182092]: 2026-01-23 09:44:29.455 182096 DEBUG oslo_concurrency.lockutils [req-ffc08c2b-95c6-4c4c-8838-f410e6c85b4f req-162b106d-ebbf-47bc-a05c-4ecb823b7ad9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-d09a7260-4b49-49b1-af38-89443d01e530" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.178 182096 DEBUG nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-vif-unplugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.178 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.178 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.178 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.179 182096 DEBUG nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] No waiting events found dispatching network-vif-unplugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.179 182096 WARNING nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received unexpected event network-vif-unplugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 for instance with vm_state deleted and task_state None.
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.179 182096 DEBUG nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.179 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "d09a7260-4b49-49b1-af38-89443d01e530-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.179 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.180 182096 DEBUG oslo_concurrency.lockutils [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "d09a7260-4b49-49b1-af38-89443d01e530-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.180 182096 DEBUG nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] No waiting events found dispatching network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:44:30 compute-0 nova_compute[182092]: 2026-01-23 09:44:30.180 182096 WARNING nova.compute.manager [req-75c7c378-adb6-4906-94e9-c403d630a0c9 req-c34d5424-e48f-4016-93de-9c1224ad4d97 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Received unexpected event network-vif-plugged-2a379fd2-6ab8-422f-a21a-a30471e8e815 for instance with vm_state deleted and task_state None.
Jan 23 09:44:32 compute-0 nova_compute[182092]: 2026-01-23 09:44:32.602 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:44:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:44:33 compute-0 podman[232705]: 2026-01-23 09:44:33.204246973 +0000 UTC m=+0.039006233 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:44:33 compute-0 podman[232704]: 2026-01-23 09:44:33.207165697 +0000 UTC m=+0.043275993 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 23 09:44:33 compute-0 nova_compute[182092]: 2026-01-23 09:44:33.440 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:34 compute-0 nova_compute[182092]: 2026-01-23 09:44:34.327 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:34 compute-0 nova_compute[182092]: 2026-01-23 09:44:34.465 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:34 compute-0 nova_compute[182092]: 2026-01-23 09:44:34.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:44:36 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:36.139 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:37 compute-0 nova_compute[182092]: 2026-01-23 09:44:37.604 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:38 compute-0 nova_compute[182092]: 2026-01-23 09:44:38.441 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:39.875 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:39.875 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:39.876 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:42 compute-0 nova_compute[182092]: 2026-01-23 09:44:42.605 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:43 compute-0 podman[232743]: 2026-01-23 09:44:43.19824016 +0000 UTC m=+0.032497439 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:44:43 compute-0 podman[232742]: 2026-01-23 09:44:43.228267901 +0000 UTC m=+0.063013831 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:44:43 compute-0 nova_compute[182092]: 2026-01-23 09:44:43.425 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161468.423647, d09a7260-4b49-49b1-af38-89443d01e530 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:43 compute-0 nova_compute[182092]: 2026-01-23 09:44:43.425 182096 INFO nova.compute.manager [-] [instance: d09a7260-4b49-49b1-af38-89443d01e530] VM Stopped (Lifecycle Event)
Jan 23 09:44:43 compute-0 nova_compute[182092]: 2026-01-23 09:44:43.439 182096 DEBUG nova.compute.manager [None req-48fac682-9559-49db-b3c9-9e14155340b3 - - - - - -] [instance: d09a7260-4b49-49b1-af38-89443d01e530] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:43 compute-0 nova_compute[182092]: 2026-01-23 09:44:43.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:46 compute-0 podman[232780]: 2026-01-23 09:44:46.198196366 +0000 UTC m=+0.037574403 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 23 09:44:47 compute-0 nova_compute[182092]: 2026-01-23 09:44:47.606 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.335 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.336 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.436 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.443 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.503 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.503 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.508 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.508 182096 INFO nova.compute.claims [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.585 182096 DEBUG nova.compute.provider_tree [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.596 182096 DEBUG nova.scheduler.client.report [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.611 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.611 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.652 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.652 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.662 182096 INFO nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.672 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.739 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.740 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.740 182096 INFO nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Creating image(s)
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.741 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.741 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.741 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.751 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.800 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.801 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.802 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.811 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.854 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.855 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.874 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.874 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.875 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.920 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.921 182096 DEBUG nova.virt.disk.api [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Checking if we can resize image /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.921 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.967 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.968 182096 DEBUG nova.virt.disk.api [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Cannot resize image /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.968 182096 DEBUG nova.objects.instance [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'migration_context' on Instance uuid 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.981 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.981 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Ensure instance console log exists: /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.981 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.982 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:48 compute-0 nova_compute[182092]: 2026-01-23 09:44:48.982 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:49 compute-0 nova_compute[182092]: 2026-01-23 09:44:49.809 182096 DEBUG nova.policy [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8aa2911d0bc0474cb77214528548d308', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:44:52 compute-0 nova_compute[182092]: 2026-01-23 09:44:52.607 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:52 compute-0 nova_compute[182092]: 2026-01-23 09:44:52.834 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Successfully created port: e94cd37f-e05e-4331-8160-f8e0c35a36c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.444 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.728 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Successfully updated port: e94cd37f-e05e-4331-8160-f8e0c35a36c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.740 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.740 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.741 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:44:53 compute-0 nova_compute[182092]: 2026-01-23 09:44:53.847 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.377 182096 DEBUG nova.network.neutron [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.392 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.392 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Instance network_info: |[{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.393 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Start _get_guest_xml network_info=[{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.397 182096 WARNING nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.401 182096 DEBUG nova.virt.libvirt.host [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.401 182096 DEBUG nova.virt.libvirt.host [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.405 182096 DEBUG nova.virt.libvirt.host [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.405 182096 DEBUG nova.virt.libvirt.host [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.406 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.406 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.407 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.407 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.407 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.407 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.407 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.408 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.408 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.408 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.408 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.408 182096 DEBUG nova.virt.hardware [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.411 182096 DEBUG nova.virt.libvirt.vif [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2111643825',display_name='tempest-TestNetworkBasicOps-server-2111643825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2111643825',id=177,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGFy75fyvrHxbyqAAhFr4J6Ziy6GqjQ0wUhuA1DpaoQawmDAyCd9Du7n18nse+F7aDNri/78W+z5Oz5kv9iU7goXYxpO5WxLH93rooOVUPnBhmb6XDAVWi1PUwaWxSoCrg==',key_name='tempest-TestNetworkBasicOps-682509342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-w122agb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:48Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.411 182096 DEBUG nova.network.os_vif_util [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.412 182096 DEBUG nova.network.os_vif_util [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.412 182096 DEBUG nova.objects.instance [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.421 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <uuid>21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5</uuid>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <name>instance-000000b1</name>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:name>tempest-TestNetworkBasicOps-server-2111643825</nova:name>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:44:54</nova:creationTime>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:user uuid="8aa2911d0bc0474cb77214528548d308">tempest-TestNetworkBasicOps-80327574-project-member</nova:user>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:project uuid="f0bc8cb17ec0443f927dd0bcb35e4b73">tempest-TestNetworkBasicOps-80327574</nova:project>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         <nova:port uuid="e94cd37f-e05e-4331-8160-f8e0c35a36c5">
Jan 23 09:44:54 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <system>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="serial">21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="uuid">21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </system>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <os>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </os>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <features>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </features>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.config"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:f8:1e:2c"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <target dev="tape94cd37f-e0"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/console.log" append="off"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <video>
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </video>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:44:54 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:44:54 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:44:54 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:44:54 compute-0 nova_compute[182092]: </domain>
Jan 23 09:44:54 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.422 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Preparing to wait for external event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.422 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.422 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.422 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.423 182096 DEBUG nova.virt.libvirt.vif [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2111643825',display_name='tempest-TestNetworkBasicOps-server-2111643825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2111643825',id=177,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGFy75fyvrHxbyqAAhFr4J6Ziy6GqjQ0wUhuA1DpaoQawmDAyCd9Du7n18nse+F7aDNri/78W+z5Oz5kv9iU7goXYxpO5WxLH93rooOVUPnBhmb6XDAVWi1PUwaWxSoCrg==',key_name='tempest-TestNetworkBasicOps-682509342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-w122agb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:48Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.423 182096 DEBUG nova.network.os_vif_util [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.423 182096 DEBUG nova.network.os_vif_util [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.424 182096 DEBUG os_vif [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.424 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.424 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.425 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.426 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.426 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape94cd37f-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.427 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape94cd37f-e0, col_values=(('external_ids', {'iface-id': 'e94cd37f-e05e-4331-8160-f8e0c35a36c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:1e:2c', 'vm-uuid': '21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.428 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 NetworkManager[54920]: <info>  [1769161494.4296] manager: (tape94cd37f-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.430 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.432 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.433 182096 INFO os_vif [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0')
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.476 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.477 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.477 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] No VIF found with MAC fa:16:3e:f8:1e:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.477 182096 INFO nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Using config drive
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.762 182096 INFO nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Creating config drive at /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.config
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.766 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcvhb34xs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.884 182096 DEBUG oslo_concurrency.processutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcvhb34xs" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:44:54 compute-0 kernel: tape94cd37f-e0: entered promiscuous mode
Jan 23 09:44:54 compute-0 NetworkManager[54920]: <info>  [1769161494.9320] manager: (tape94cd37f-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Jan 23 09:44:54 compute-0 ovn_controller[94697]: 2026-01-23T09:44:54Z|00732|binding|INFO|Claiming lport e94cd37f-e05e-4331-8160-f8e0c35a36c5 for this chassis.
Jan 23 09:44:54 compute-0 ovn_controller[94697]: 2026-01-23T09:44:54Z|00733|binding|INFO|e94cd37f-e05e-4331-8160-f8e0c35a36c5: Claiming fa:16:3e:f8:1e:2c 10.100.0.9
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.936 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.941 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.945 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:1e:2c 10.100.0.9'], port_security=['fa:16:3e:f8:1e:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26aa0032-133a-4cef-adf4-15be17c564a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c360d80-c223-4268-b489-ca9bf10523e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb68eb38-a889-4b0c-8963-7dc56140e204, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=e94cd37f-e05e-4331-8160-f8e0c35a36c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.946 103978 INFO neutron.agent.ovn.metadata.agent [-] Port e94cd37f-e05e-4331-8160-f8e0c35a36c5 in datapath 26aa0032-133a-4cef-adf4-15be17c564a2 bound to our chassis
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.947 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 26aa0032-133a-4cef-adf4-15be17c564a2
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.956 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d02bd25e-6b0a-4b1a-9b94-b3078f1a4720]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.957 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap26aa0032-11 in ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.958 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap26aa0032-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.958 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[21a65ac8-cb17-441f-8326-847597d5d065]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.959 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[23be0de8-fc66-40ef-b3b7-b138d2bcc654]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:54 compute-0 systemd-udevd[232834]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.969 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ce100848-86ee-4db0-84f9-e56efc4372f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:54 compute-0 NetworkManager[54920]: <info>  [1769161494.9714] device (tape94cd37f-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:44:54 compute-0 NetworkManager[54920]: <info>  [1769161494.9721] device (tape94cd37f-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:44:54 compute-0 systemd-machined[153562]: New machine qemu-85-instance-000000b1.
Jan 23 09:44:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:54.996 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[91d2a877-18d8-4891-956f-fcb10cf469cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:54 compute-0 systemd[1]: Started Virtual Machine qemu-85-instance-000000b1.
Jan 23 09:44:54 compute-0 nova_compute[182092]: 2026-01-23 09:44:54.997 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:55 compute-0 ovn_controller[94697]: 2026-01-23T09:44:55Z|00734|binding|INFO|Setting lport e94cd37f-e05e-4331-8160-f8e0c35a36c5 ovn-installed in OVS
Jan 23 09:44:55 compute-0 ovn_controller[94697]: 2026-01-23T09:44:55Z|00735|binding|INFO|Setting lport e94cd37f-e05e-4331-8160-f8e0c35a36c5 up in Southbound
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.001 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.018 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[28a471e4-7037-45b4-ad8e-7c7f4cbd5f80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 NetworkManager[54920]: <info>  [1769161495.0263] manager: (tap26aa0032-10): new Veth device (/org/freedesktop/NetworkManager/Devices/358)
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.023 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[375b6228-4198-4e57-b737-c99167294398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.045 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[e5fb9f14-2a62-44a4-9d3c-65f800b8f43b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.049 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[460a0927-9742-4625-bcc8-f41eeb414b0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 NetworkManager[54920]: <info>  [1769161495.0651] device (tap26aa0032-10): carrier: link connected
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.068 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[b088ba2a-8809-4800-b9a8-dfcb7981dbbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.081 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5a6fa7-7f8f-455e-abdf-753f5add1daa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap26aa0032-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:61:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500055, 'reachable_time': 35757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232859, 'error': None, 'target': 'ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.090 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f22a6e75-1018-4509-bcd6-84f65d9e69b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:6109'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500055, 'tstamp': 500055}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232860, 'error': None, 'target': 'ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.101 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcc0131-8dcd-41c5-8ab7-11ddc282edba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap26aa0032-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:61:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 216], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500055, 'reachable_time': 35757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232861, 'error': None, 'target': 'ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.120 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d3871b03-9f7c-4e17-bc65-c776ba936a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.158 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3591812-d5b8-42f7-ae72-ce3cdb0569e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.159 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26aa0032-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.160 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.160 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26aa0032-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:55 compute-0 kernel: tap26aa0032-10: entered promiscuous mode
Jan 23 09:44:55 compute-0 NetworkManager[54920]: <info>  [1769161495.1623] manager: (tap26aa0032-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.163 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.168 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap26aa0032-10, col_values=(('external_ids', {'iface-id': '9d5de415-a257-4f36-8349-608bd8f2c79d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:44:55 compute-0 ovn_controller[94697]: 2026-01-23T09:44:55Z|00736|binding|INFO|Releasing lport 9d5de415-a257-4f36-8349-608bd8f2c79d from this chassis (sb_readonly=0)
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.169 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.179 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/26aa0032-133a-4cef-adf4-15be17c564a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/26aa0032-133a-4cef-adf4-15be17c564a2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.180 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4863cebb-9a08-4ce2-a299-d19ab0c00278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.180 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-26aa0032-133a-4cef-adf4-15be17c564a2
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/26aa0032-133a-4cef-adf4-15be17c564a2.pid.haproxy
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 26aa0032-133a-4cef-adf4-15be17c564a2
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:44:55 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:44:55.181 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2', 'env', 'PROCESS_TAG=haproxy-26aa0032-133a-4cef-adf4-15be17c564a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/26aa0032-133a-4cef-adf4-15be17c564a2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.182 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:55 compute-0 podman[232889]: 2026-01-23 09:44:55.450408414 +0000 UTC m=+0.030380145 container create 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 09:44:55 compute-0 systemd[1]: Started libpod-conmon-8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f.scope.
Jan 23 09:44:55 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:44:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed9073c0d273cda80aed710a1dc8a72e61f0537559dad73bc97fe2b1dafb8598/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:44:55 compute-0 podman[232889]: 2026-01-23 09:44:55.514471854 +0000 UTC m=+0.094443585 container init 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:44:55 compute-0 podman[232889]: 2026-01-23 09:44:55.519080965 +0000 UTC m=+0.099052697 container start 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:44:55 compute-0 podman[232889]: 2026-01-23 09:44:55.436880746 +0000 UTC m=+0.016852477 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:44:55 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [NOTICE]   (232905) : New worker (232907) forked
Jan 23 09:44:55 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [NOTICE]   (232905) : Loading success.
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.645 182096 DEBUG nova.compute.manager [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.645 182096 DEBUG nova.compute.manager [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing instance network info cache due to event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.646 182096 DEBUG oslo_concurrency.lockutils [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.646 182096 DEBUG oslo_concurrency.lockutils [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:44:55 compute-0 nova_compute[182092]: 2026-01-23 09:44:55.647 182096 DEBUG nova.network.neutron [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.073 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161496.073052, 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.073 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] VM Started (Lifecycle Event)
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.104 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.107 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161496.0756967, 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.107 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] VM Paused (Lifecycle Event)
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.136 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.138 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:44:56 compute-0 nova_compute[182092]: 2026-01-23 09:44:56.151 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:44:57 compute-0 podman[232919]: 2026-01-23 09:44:57.218887111 +0000 UTC m=+0.056673024 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.345 182096 DEBUG nova.compute.manager [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.346 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.346 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.346 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.346 182096 DEBUG nova.compute.manager [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Processing event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.347 182096 DEBUG nova.compute.manager [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.347 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.347 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.347 182096 DEBUG oslo_concurrency.lockutils [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.347 182096 DEBUG nova.compute.manager [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.348 182096 WARNING nova.compute.manager [req-4da37257-6880-4a4d-8ec8-34a66f330970 req-ede8344e-2401-4ea2-90ce-79483fd8ee67 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state building and task_state spawning.
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.348 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.351 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161497.350836, 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.351 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] VM Resumed (Lifecycle Event)
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.352 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.354 182096 INFO nova.virt.libvirt.driver [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Instance spawned successfully.
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.354 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.367 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.371 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.373 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.373 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.374 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.374 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.374 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.375 182096 DEBUG nova.virt.libvirt.driver [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.415 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.450 182096 INFO nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Took 8.71 seconds to spawn the instance on the hypervisor.
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.450 182096 DEBUG nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.533 182096 INFO nova.compute.manager [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Took 9.06 seconds to build instance.
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.547 182096 DEBUG oslo_concurrency.lockutils [None req-424fd05a-3524-4e70-8486-90db12866cf6 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:44:57 compute-0 nova_compute[182092]: 2026-01-23 09:44:57.609 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:44:58 compute-0 nova_compute[182092]: 2026-01-23 09:44:58.778 182096 DEBUG nova.network.neutron [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated VIF entry in instance network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:44:58 compute-0 nova_compute[182092]: 2026-01-23 09:44:58.779 182096 DEBUG nova.network.neutron [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:44:58 compute-0 nova_compute[182092]: 2026-01-23 09:44:58.794 182096 DEBUG oslo_concurrency.lockutils [req-5a46ec33-4659-486b-afef-bd4870e225d6 req-11385791-9bd0-4348-8121-8387f94baed2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:44:59 compute-0 nova_compute[182092]: 2026-01-23 09:44:59.430 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.020 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:01 compute-0 NetworkManager[54920]: <info>  [1769161501.0207] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Jan 23 09:45:01 compute-0 NetworkManager[54920]: <info>  [1769161501.0215] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.115 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:01 compute-0 ovn_controller[94697]: 2026-01-23T09:45:01Z|00737|binding|INFO|Releasing lport 9d5de415-a257-4f36-8349-608bd8f2c79d from this chassis (sb_readonly=0)
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.123 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.361 182096 DEBUG nova.compute.manager [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.361 182096 DEBUG nova.compute.manager [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing instance network info cache due to event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.361 182096 DEBUG oslo_concurrency.lockutils [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.362 182096 DEBUG oslo_concurrency.lockutils [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:45:01 compute-0 nova_compute[182092]: 2026-01-23 09:45:01.362 182096 DEBUG nova.network.neutron [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:45:01 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:45:02 compute-0 nova_compute[182092]: 2026-01-23 09:45:02.612 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:02 compute-0 nova_compute[182092]: 2026-01-23 09:45:02.779 182096 DEBUG nova.network.neutron [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated VIF entry in instance network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:45:02 compute-0 nova_compute[182092]: 2026-01-23 09:45:02.779 182096 DEBUG nova.network.neutron [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:02 compute-0 nova_compute[182092]: 2026-01-23 09:45:02.794 182096 DEBUG oslo_concurrency.lockutils [req-ef34e6ab-dc15-482f-ba0c-bbad33d5d78a req-6fd6eb3f-e332-4239-82be-d6113d04cf3c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:45:04 compute-0 podman[232946]: 2026-01-23 09:45:04.214211868 +0000 UTC m=+0.048890089 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:45:04 compute-0 podman[232945]: 2026-01-23 09:45:04.238250256 +0000 UTC m=+0.074957763 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:45:04 compute-0 ovn_controller[94697]: 2026-01-23T09:45:04Z|00738|binding|INFO|Releasing lport 9d5de415-a257-4f36-8349-608bd8f2c79d from this chassis (sb_readonly=0)
Jan 23 09:45:04 compute-0 nova_compute[182092]: 2026-01-23 09:45:04.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:04 compute-0 nova_compute[182092]: 2026-01-23 09:45:04.431 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:07 compute-0 nova_compute[182092]: 2026-01-23 09:45:07.613 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:09 compute-0 nova_compute[182092]: 2026-01-23 09:45:09.433 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:09 compute-0 ovn_controller[94697]: 2026-01-23T09:45:09Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:1e:2c 10.100.0.9
Jan 23 09:45:09 compute-0 ovn_controller[94697]: 2026-01-23T09:45:09Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:1e:2c 10.100.0.9
Jan 23 09:45:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:10.078 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:45:10 compute-0 nova_compute[182092]: 2026-01-23 09:45:10.079 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:10.081 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:45:12 compute-0 nova_compute[182092]: 2026-01-23 09:45:12.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:12 compute-0 nova_compute[182092]: 2026-01-23 09:45:12.614 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:14 compute-0 podman[232996]: 2026-01-23 09:45:14.207833586 +0000 UTC m=+0.039615551 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:45:14 compute-0 podman[232997]: 2026-01-23 09:45:14.208410644 +0000 UTC m=+0.037929293 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 23 09:45:14 compute-0 nova_compute[182092]: 2026-01-23 09:45:14.436 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:14 compute-0 nova_compute[182092]: 2026-01-23 09:45:14.963 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:16.083 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:45:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:16.182 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:9e:45 10.100.0.2 2001:db8::f816:3eff:fe6a:9e45'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe6a:9e45/64', 'neutron:device_id': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b15eafd7-caf2-4600-afb8-dc5a43f9194c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9f7f3dca-a3f6-4bfe-9810-13603b340b1c) old=Port_Binding(mac=['fa:16:3e:6a:9e:45 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:45:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:16.183 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9f7f3dca-a3f6-4bfe-9810-13603b340b1c in datapath 136fe3f2-f077-4093-8452-cdc02e0c2016 updated
Jan 23 09:45:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:16.184 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 136fe3f2-f077-4093-8452-cdc02e0c2016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:45:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:16.185 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[0980edbb-c869-4772-9935-0f4d94082a17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:17 compute-0 podman[233034]: 2026-01-23 09:45:17.206466608 +0000 UTC m=+0.042113442 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter)
Jan 23 09:45:17 compute-0 nova_compute[182092]: 2026-01-23 09:45:17.615 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:17 compute-0 nova_compute[182092]: 2026-01-23 09:45:17.657 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:18 compute-0 nova_compute[182092]: 2026-01-23 09:45:18.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:18.986 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:9e:45 10.100.0.2 2001:db8:0:1:f816:3eff:fe6a:9e45 2001:db8::f816:3eff:fe6a:9e45'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe6a:9e45/64 2001:db8::f816:3eff:fe6a:9e45/64', 'neutron:device_id': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b15eafd7-caf2-4600-afb8-dc5a43f9194c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9f7f3dca-a3f6-4bfe-9810-13603b340b1c) old=Port_Binding(mac=['fa:16:3e:6a:9e:45 10.100.0.2 2001:db8::f816:3eff:fe6a:9e45'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe6a:9e45/64', 'neutron:device_id': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:45:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:18.987 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9f7f3dca-a3f6-4bfe-9810-13603b340b1c in datapath 136fe3f2-f077-4093-8452-cdc02e0c2016 updated
Jan 23 09:45:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:18.988 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 136fe3f2-f077-4093-8452-cdc02e0c2016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:45:18 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:18.988 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe779938-0eb3-47fa-847d-da3acaa2334e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:19 compute-0 nova_compute[182092]: 2026-01-23 09:45:19.438 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:19 compute-0 nova_compute[182092]: 2026-01-23 09:45:19.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:20 compute-0 nova_compute[182092]: 2026-01-23 09:45:20.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.679 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.680 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.680 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.680 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.739 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.788 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.788 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:45:21 compute-0 nova_compute[182092]: 2026-01-23 09:45:21.836 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.045 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.046 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5532MB free_disk=73.18339538574219GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.046 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.047 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.288 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.289 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.289 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.322 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.336 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.355 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.355 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:22 compute-0 nova_compute[182092]: 2026-01-23 09:45:22.617 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.357 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.357 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.357 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.537 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.537 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.537 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:45:23 compute-0 nova_compute[182092]: 2026-01-23 09:45:23.538 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:45:24 compute-0 nova_compute[182092]: 2026-01-23 09:45:24.442 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:25 compute-0 nova_compute[182092]: 2026-01-23 09:45:25.006 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:25 compute-0 nova_compute[182092]: 2026-01-23 09:45:25.027 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:45:25 compute-0 nova_compute[182092]: 2026-01-23 09:45:25.027 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:45:25 compute-0 nova_compute[182092]: 2026-01-23 09:45:25.028 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:26 compute-0 nova_compute[182092]: 2026-01-23 09:45:26.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:27 compute-0 nova_compute[182092]: 2026-01-23 09:45:27.619 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:28 compute-0 podman[233059]: 2026-01-23 09:45:28.220633619 +0000 UTC m=+0.058569512 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:45:28 compute-0 nova_compute[182092]: 2026-01-23 09:45:28.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:45:28 compute-0 nova_compute[182092]: 2026-01-23 09:45:28.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:45:29 compute-0 nova_compute[182092]: 2026-01-23 09:45:29.445 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:32 compute-0 nova_compute[182092]: 2026-01-23 09:45:32.366 182096 INFO nova.compute.manager [None req-875366ab-a392-411c-b46f-675e204aea61 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Get console output
Jan 23 09:45:32 compute-0 nova_compute[182092]: 2026-01-23 09:45:32.369 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:45:32 compute-0 nova_compute[182092]: 2026-01-23 09:45:32.620 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.194 182096 DEBUG nova.compute.manager [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.194 182096 DEBUG nova.compute.manager [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing instance network info cache due to event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.195 182096 DEBUG oslo_concurrency.lockutils [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.195 182096 DEBUG oslo_concurrency.lockutils [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.195 182096 DEBUG nova.network.neutron [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.262 182096 DEBUG nova.compute.manager [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.262 182096 DEBUG oslo_concurrency.lockutils [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.262 182096 DEBUG oslo_concurrency.lockutils [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.262 182096 DEBUG oslo_concurrency.lockutils [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.263 182096 DEBUG nova.compute.manager [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:33 compute-0 nova_compute[182092]: 2026-01-23 09:45:33.263 182096 WARNING nova.compute.manager [req-f0d6737d-8b9c-48c8-aeb6-59ef20d655a1 req-c4da50d4-441e-4db5-95e4-7ed69c7dba6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state active and task_state None.
Jan 23 09:45:34 compute-0 nova_compute[182092]: 2026-01-23 09:45:34.447 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:35 compute-0 podman[233083]: 2026-01-23 09:45:35.207489056 +0000 UTC m=+0.041444219 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:45:35 compute-0 podman[233084]: 2026-01-23 09:45:35.213092943 +0000 UTC m=+0.044289745 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.369 182096 DEBUG nova.network.neutron [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated VIF entry in instance network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.369 182096 DEBUG nova.network.neutron [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.389 182096 INFO nova.compute.manager [None req-d794e1ad-97fb-40c0-91e2-ef7ff9256e2a 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Get console output
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.393 182096 DEBUG nova.compute.manager [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.394 182096 DEBUG oslo_concurrency.lockutils [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.394 182096 DEBUG oslo_concurrency.lockutils [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.394 182096 DEBUG oslo_concurrency.lockutils [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.394 182096 DEBUG nova.compute.manager [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.394 182096 WARNING nova.compute.manager [req-ec6ea6ba-c785-4a43-91da-bf1bc00245a0 req-7c40a896-c99d-4cf6-907f-1c21edcc7165 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state active and task_state None.
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.392 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:45:35 compute-0 nova_compute[182092]: 2026-01-23 09:45:35.399 182096 DEBUG oslo_concurrency.lockutils [req-8acae367-5f6d-4854-a693-0f807456a32a req-c2d4c5d7-4e22-4529-a4f5-29ab4388aa36 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.621 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.938 182096 DEBUG nova.compute.manager [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.939 182096 DEBUG nova.compute.manager [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing instance network info cache due to event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.939 182096 DEBUG oslo_concurrency.lockutils [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.940 182096 DEBUG oslo_concurrency.lockutils [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:45:37 compute-0 nova_compute[182092]: 2026-01-23 09:45:37.940 182096 DEBUG nova.network.neutron [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:45:39 compute-0 nova_compute[182092]: 2026-01-23 09:45:39.449 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:39.875 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:39.876 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:39.876 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.795 182096 DEBUG nova.network.neutron [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated VIF entry in instance network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.796 182096 DEBUG nova.network.neutron [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.804 182096 INFO nova.compute.manager [None req-22697b4d-98f4-4a43-aad3-97db470408da 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Get console output
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.807 210059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.809 182096 DEBUG nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.809 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.809 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.810 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.810 182096 DEBUG nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.810 182096 WARNING nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state active and task_state None.
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.810 182096 DEBUG nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.810 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.811 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.811 182096 DEBUG oslo_concurrency.lockutils [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.811 182096 DEBUG nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.811 182096 WARNING nova.compute.manager [req-ad9a7813-8c0e-4466-813c-8e8e02c740fc req-49648723-ca19-461b-8612-71491baa6634 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state active and task_state None.
Jan 23 09:45:40 compute-0 nova_compute[182092]: 2026-01-23 09:45:40.818 182096 DEBUG oslo_concurrency.lockutils [req-9d87346b-5579-4aee-b95a-5f21018c630c req-b7bd1a2d-2d45-4b7a-9cb6-aaf22ce9d4a1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:45:42 compute-0 nova_compute[182092]: 2026-01-23 09:45:42.622 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:44 compute-0 nova_compute[182092]: 2026-01-23 09:45:44.451 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:45 compute-0 podman[233121]: 2026-01-23 09:45:45.202167695 +0000 UTC m=+0.040273252 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Jan 23 09:45:45 compute-0 podman[233122]: 2026-01-23 09:45:45.207251722 +0000 UTC m=+0.043580979 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:45:47 compute-0 nova_compute[182092]: 2026-01-23 09:45:47.623 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:48 compute-0 podman[233159]: 2026-01-23 09:45:48.199135178 +0000 UTC m=+0.037115106 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, 
com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.049 182096 DEBUG nova.compute.manager [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.049 182096 DEBUG nova.compute.manager [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing instance network info cache due to event network-changed-e94cd37f-e05e-4331-8160-f8e0c35a36c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.049 182096 DEBUG oslo_concurrency.lockutils [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.050 182096 DEBUG oslo_concurrency.lockutils [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.050 182096 DEBUG nova.network.neutron [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Refreshing network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.136 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.136 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.136 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.137 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.137 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.144 182096 INFO nova.compute.manager [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Terminating instance
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.149 182096 DEBUG nova.compute.manager [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:45:49 compute-0 kernel: tape94cd37f-e0 (unregistering): left promiscuous mode
Jan 23 09:45:49 compute-0 NetworkManager[54920]: <info>  [1769161549.1683] device (tape94cd37f-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:45:49 compute-0 ovn_controller[94697]: 2026-01-23T09:45:49Z|00739|binding|INFO|Releasing lport e94cd37f-e05e-4331-8160-f8e0c35a36c5 from this chassis (sb_readonly=0)
Jan 23 09:45:49 compute-0 ovn_controller[94697]: 2026-01-23T09:45:49Z|00740|binding|INFO|Setting lport e94cd37f-e05e-4331-8160-f8e0c35a36c5 down in Southbound
Jan 23 09:45:49 compute-0 ovn_controller[94697]: 2026-01-23T09:45:49Z|00741|binding|INFO|Removing iface tape94cd37f-e0 ovn-installed in OVS
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.176 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.187 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:1e:2c 10.100.0.9'], port_security=['fa:16:3e:f8:1e:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26aa0032-133a-4cef-adf4-15be17c564a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f0bc8cb17ec0443f927dd0bcb35e4b73', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5c360d80-c223-4268-b489-ca9bf10523e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb68eb38-a889-4b0c-8963-7dc56140e204, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=e94cd37f-e05e-4331-8160-f8e0c35a36c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.188 103978 INFO neutron.agent.ovn.metadata.agent [-] Port e94cd37f-e05e-4331-8160-f8e0c35a36c5 in datapath 26aa0032-133a-4cef-adf4-15be17c564a2 unbound from our chassis
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.189 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26aa0032-133a-4cef-adf4-15be17c564a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.189 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5cdc1c-c1ea-48dd-9803-59e7a92efdba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.190 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2 namespace which is not needed anymore
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.191 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 23 09:45:49 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b1.scope: Consumed 13.088s CPU time.
Jan 23 09:45:49 compute-0 systemd-machined[153562]: Machine qemu-85-instance-000000b1 terminated.
Jan 23 09:45:49 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [NOTICE]   (232905) : haproxy version is 2.8.14-c23fe91
Jan 23 09:45:49 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [NOTICE]   (232905) : path to executable is /usr/sbin/haproxy
Jan 23 09:45:49 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [WARNING]  (232905) : Exiting Master process...
Jan 23 09:45:49 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [ALERT]    (232905) : Current worker (232907) exited with code 143 (Terminated)
Jan 23 09:45:49 compute-0 neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2[232901]: [WARNING]  (232905) : All workers exited. Exiting... (0)
Jan 23 09:45:49 compute-0 systemd[1]: libpod-8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f.scope: Deactivated successfully.
Jan 23 09:45:49 compute-0 podman[233198]: 2026-01-23 09:45:49.286637106 +0000 UTC m=+0.032157429 container died 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 09:45:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f-userdata-shm.mount: Deactivated successfully.
Jan 23 09:45:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed9073c0d273cda80aed710a1dc8a72e61f0537559dad73bc97fe2b1dafb8598-merged.mount: Deactivated successfully.
Jan 23 09:45:49 compute-0 podman[233198]: 2026-01-23 09:45:49.302947623 +0000 UTC m=+0.048467945 container cleanup 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:45:49 compute-0 systemd[1]: libpod-conmon-8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f.scope: Deactivated successfully.
Jan 23 09:45:49 compute-0 podman[233221]: 2026-01-23 09:45:49.34409936 +0000 UTC m=+0.025153281 container remove 8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.347 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8bb68d-78b4-477d-89ae-f6c36bd38ed4]: (4, ('Fri Jan 23 09:45:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2 (8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f)\n8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f\nFri Jan 23 09:45:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2 (8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f)\n8276d1ee760c7d7b0b086ff3107936c2b169fe34b420965dda637936e461028f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.348 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac797ae-e665-4bbf-a6f3-f525e951a2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.349 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26aa0032-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.351 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 kernel: tap26aa0032-10: left promiscuous mode
Jan 23 09:45:49 compute-0 NetworkManager[54920]: <info>  [1769161549.3674] manager: (tape94cd37f-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/362)
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.369 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.371 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1532457c-6728-4a37-aaf7-6ef89ac350f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.379 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[2ced1c9d-719f-438f-9b91-a72679dcd690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.380 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[91156ee1-1f37-465e-9fd0-298c8fbfe4c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.394 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[074d56e3-b38e-43a0-b1b3-863275dbfa12]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500050, 'reachable_time': 32830, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233246, 'error': None, 'target': 'ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d26aa0032\x2d133a\x2d4cef\x2dadf4\x2d15be17c564a2.mount: Deactivated successfully.
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.396 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-26aa0032-133a-4cef-adf4-15be17c564a2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:45:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:49.396 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[01a2cec5-da7b-4cce-b656-bf15012a3cd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.398 182096 INFO nova.virt.libvirt.driver [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Instance destroyed successfully.
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.398 182096 DEBUG nova.objects.instance [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lazy-loading 'resources' on Instance uuid 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.416 182096 DEBUG nova.virt.libvirt.vif [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:44:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2111643825',display_name='tempest-TestNetworkBasicOps-server-2111643825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2111643825',id=177,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGFy75fyvrHxbyqAAhFr4J6Ziy6GqjQ0wUhuA1DpaoQawmDAyCd9Du7n18nse+F7aDNri/78W+z5Oz5kv9iU7goXYxpO5WxLH93rooOVUPnBhmb6XDAVWi1PUwaWxSoCrg==',key_name='tempest-TestNetworkBasicOps-682509342',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:44:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f0bc8cb17ec0443f927dd0bcb35e4b73',ramdisk_id='',reservation_id='r-w122agb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-80327574',owner_user_name='tempest-TestNetworkBasicOps-80327574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:44:57Z,user_data=None,user_id='8aa2911d0bc0474cb77214528548d308',uuid=21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.416 182096 DEBUG nova.network.os_vif_util [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converting VIF {"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.417 182096 DEBUG nova.network.os_vif_util [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.417 182096 DEBUG os_vif [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.419 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.419 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape94cd37f-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.422 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.424 182096 INFO os_vif [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:1e:2c,bridge_name='br-int',has_traffic_filtering=True,id=e94cd37f-e05e-4331-8160-f8e0c35a36c5,network=Network(26aa0032-133a-4cef-adf4-15be17c564a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape94cd37f-e0')
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.424 182096 INFO nova.virt.libvirt.driver [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Deleting instance files /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5_del
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.425 182096 INFO nova.virt.libvirt.driver [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Deletion of /var/lib/nova/instances/21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5_del complete
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.432 182096 DEBUG nova.compute.manager [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.433 182096 DEBUG oslo_concurrency.lockutils [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.433 182096 DEBUG oslo_concurrency.lockutils [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.433 182096 DEBUG oslo_concurrency.lockutils [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.433 182096 DEBUG nova.compute.manager [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.433 182096 DEBUG nova.compute.manager [req-f74b8f20-b8bc-454a-a969-64b3193fd694 req-92a9d1c8-6f3c-4456-8db9-6951146e301e 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-unplugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.477 182096 INFO nova.compute.manager [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.478 182096 DEBUG oslo.service.loopingcall [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.478 182096 DEBUG nova.compute.manager [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:45:49 compute-0 nova_compute[182092]: 2026-01-23 09:45:49.478 182096 DEBUG nova.network.neutron [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.077 182096 DEBUG nova.network.neutron [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.106 182096 INFO nova.compute.manager [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Took 0.63 seconds to deallocate network for instance.
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.178 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.179 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.181 182096 DEBUG nova.compute.manager [req-b171bd39-4cd6-4b81-a180-66137fc09ec4 req-7b56b55e-bd28-4f9a-a292-8b5e9e4e7873 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-deleted-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.236 182096 DEBUG nova.compute.provider_tree [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.253 182096 DEBUG nova.scheduler.client.report [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.270 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.302 182096 INFO nova.scheduler.client.report [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Deleted allocations for instance 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.361 182096 DEBUG nova.network.neutron [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updated VIF entry in instance network info cache for port e94cd37f-e05e-4331-8160-f8e0c35a36c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.361 182096 DEBUG nova.network.neutron [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Updating instance_info_cache with network_info: [{"id": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "address": "fa:16:3e:f8:1e:2c", "network": {"id": "26aa0032-133a-4cef-adf4-15be17c564a2", "bridge": "br-int", "label": "tempest-network-smoke--611014026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f0bc8cb17ec0443f927dd0bcb35e4b73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape94cd37f-e0", "ovs_interfaceid": "e94cd37f-e05e-4331-8160-f8e0c35a36c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.378 182096 DEBUG oslo_concurrency.lockutils [req-102dc95d-cccb-4cca-8e3f-e19f7ea51dbc req-3e2a377b-7ea8-4fad-99e7-d8e85d156c88 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:45:50 compute-0 nova_compute[182092]: 2026-01-23 09:45:50.379 182096 DEBUG oslo_concurrency.lockutils [None req-337ad4e6-bcac-4a1d-980e-cab86fe38f95 8aa2911d0bc0474cb77214528548d308 f0bc8cb17ec0443f927dd0bcb35e4b73 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.542 182096 DEBUG nova.compute.manager [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.542 182096 DEBUG oslo_concurrency.lockutils [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.542 182096 DEBUG oslo_concurrency.lockutils [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.542 182096 DEBUG oslo_concurrency.lockutils [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.543 182096 DEBUG nova.compute.manager [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] No waiting events found dispatching network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:45:51 compute-0 nova_compute[182092]: 2026-01-23 09:45:51.543 182096 WARNING nova.compute.manager [req-5b000139-45d0-4fa9-9982-ba2edc485fcb req-3a2e120a-66e7-49e8-8382-e2bec8a8cb18 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Received unexpected event network-vif-plugged-e94cd37f-e05e-4331-8160-f8e0c35a36c5 for instance with vm_state deleted and task_state None.
Jan 23 09:45:52 compute-0 nova_compute[182092]: 2026-01-23 09:45:52.625 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:53 compute-0 nova_compute[182092]: 2026-01-23 09:45:53.874 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:53 compute-0 nova_compute[182092]: 2026-01-23 09:45:53.989 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:54.215 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:45:54 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:54.216 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:45:54 compute-0 nova_compute[182092]: 2026-01-23 09:45:54.216 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:54 compute-0 nova_compute[182092]: 2026-01-23 09:45:54.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:57 compute-0 nova_compute[182092]: 2026-01-23 09:45:57.626 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:45:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:45:58.217 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:45:59 compute-0 podman[233253]: 2026-01-23 09:45:59.225151762 +0000 UTC m=+0.060328037 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 09:45:59 compute-0 nova_compute[182092]: 2026-01-23 09:45:59.421 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:02 compute-0 nova_compute[182092]: 2026-01-23 09:46:02.629 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.396 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161549.395597, 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.397 182096 INFO nova.compute.manager [-] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] VM Stopped (Lifecycle Event)
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.420 182096 DEBUG nova.compute.manager [None req-2d2c1411-ca60-4c02-81cd-4b5c383d28c9 - - - - - -] [instance: 21b9e003-f9ca-40e6-9a1b-fe7968a2f9a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.422 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.527 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.528 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.562 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.661 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.661 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.667 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.667 182096 INFO nova.compute.claims [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.785 182096 DEBUG nova.compute.provider_tree [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.794 182096 DEBUG nova.scheduler.client.report [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.812 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.813 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.887 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.887 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.905 182096 INFO nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:46:04 compute-0 nova_compute[182092]: 2026-01-23 09:46:04.921 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.046 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.047 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.047 182096 INFO nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Creating image(s)
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.048 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.048 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.049 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.058 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.106 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.106 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.107 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.116 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.162 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.163 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.185 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.186 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.186 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.231 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.232 182096 DEBUG nova.virt.disk.api [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.233 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.278 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.279 182096 DEBUG nova.virt.disk.api [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.279 182096 DEBUG nova.objects.instance [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid 78ec13a1-543d-4ea8-8050-65fa6257d9fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.291 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.291 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Ensure instance console log exists: /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.292 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.292 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.292 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:05 compute-0 nova_compute[182092]: 2026-01-23 09:46:05.586 182096 DEBUG nova.policy [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:46:06 compute-0 podman[233293]: 2026-01-23 09:46:06.197991932 +0000 UTC m=+0.034814519 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:46:06 compute-0 podman[233292]: 2026-01-23 09:46:06.206277245 +0000 UTC m=+0.044432253 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 09:46:07 compute-0 nova_compute[182092]: 2026-01-23 09:46:07.629 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:08 compute-0 nova_compute[182092]: 2026-01-23 09:46:08.625 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Successfully created port: 01da91c5-3e43-4507-80c4-3878ddbe17d2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.423 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.817 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Successfully updated port: 01da91c5-3e43-4507-80c4-3878ddbe17d2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.832 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.832 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.833 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.910 182096 DEBUG nova.compute.manager [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.911 182096 DEBUG nova.compute.manager [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing instance network info cache due to event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:46:09 compute-0 nova_compute[182092]: 2026-01-23 09:46:09.911 182096 DEBUG oslo_concurrency.lockutils [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:46:10 compute-0 nova_compute[182092]: 2026-01-23 09:46:10.845 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:46:12 compute-0 nova_compute[182092]: 2026-01-23 09:46:12.631 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.823 182096 DEBUG nova.network.neutron [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.836 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.836 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Instance network_info: |[{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.837 182096 DEBUG oslo_concurrency.lockutils [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.837 182096 DEBUG nova.network.neutron [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.839 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Start _get_guest_xml network_info=[{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.844 182096 WARNING nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.848 182096 DEBUG nova.virt.libvirt.host [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.848 182096 DEBUG nova.virt.libvirt.host [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.853 182096 DEBUG nova.virt.libvirt.host [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.853 182096 DEBUG nova.virt.libvirt.host [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.854 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.854 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.855 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.855 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.855 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.855 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.855 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.856 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.856 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.856 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.856 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.856 182096 DEBUG nova.virt.hardware [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.859 182096 DEBUG nova.virt.libvirt.vif [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1431643373',display_name='tempest-TestGettingAddress-server-1431643373',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1431643373',id=180,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1WmpVwZgssx0dKhDaYyOZoXvrh//REcNU1I6fcdxkeIdHI6/R2eKf0MVdctH9uq6G3nY4ECs0iB1WYuY4olFiFPn/iZzzrfzhp9S8favg4NKplZPC7FtUDcxIHvirdfQ==',key_name='tempest-TestGettingAddress-1521133497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-8zpw90zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:46:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=78ec13a1-543d-4ea8-8050-65fa6257d9fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.859 182096 DEBUG nova.network.os_vif_util [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.860 182096 DEBUG nova.network.os_vif_util [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.861 182096 DEBUG nova.objects.instance [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid 78ec13a1-543d-4ea8-8050-65fa6257d9fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.872 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <uuid>78ec13a1-543d-4ea8-8050-65fa6257d9fb</uuid>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <name>instance-000000b4</name>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-1431643373</nova:name>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:46:13</nova:creationTime>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         <nova:port uuid="01da91c5-3e43-4507-80c4-3878ddbe17d2">
Jan 23 09:46:13 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fefd:c007" ipVersion="6"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fefd:c007" ipVersion="6"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <system>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="serial">78ec13a1-543d-4ea8-8050-65fa6257d9fb</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="uuid">78ec13a1-543d-4ea8-8050-65fa6257d9fb</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </system>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <os>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </os>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <features>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </features>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.config"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:fd:c0:07"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <target dev="tap01da91c5-3e"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/console.log" append="off"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <video>
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </video>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:46:13 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:46:13 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:46:13 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:46:13 compute-0 nova_compute[182092]: </domain>
Jan 23 09:46:13 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.873 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Preparing to wait for external event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.873 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.874 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.874 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.874 182096 DEBUG nova.virt.libvirt.vif [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1431643373',display_name='tempest-TestGettingAddress-server-1431643373',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1431643373',id=180,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1WmpVwZgssx0dKhDaYyOZoXvrh//REcNU1I6fcdxkeIdHI6/R2eKf0MVdctH9uq6G3nY4ECs0iB1WYuY4olFiFPn/iZzzrfzhp9S8favg4NKplZPC7FtUDcxIHvirdfQ==',key_name='tempest-TestGettingAddress-1521133497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-8zpw90zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:46:04Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=78ec13a1-543d-4ea8-8050-65fa6257d9fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.875 182096 DEBUG nova.network.os_vif_util [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.875 182096 DEBUG nova.network.os_vif_util [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.875 182096 DEBUG os_vif [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.876 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.876 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.876 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.878 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.879 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01da91c5-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.879 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01da91c5-3e, col_values=(('external_ids', {'iface-id': '01da91c5-3e43-4507-80c4-3878ddbe17d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:c0:07', 'vm-uuid': '78ec13a1-543d-4ea8-8050-65fa6257d9fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.881 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:13 compute-0 NetworkManager[54920]: <info>  [1769161573.8820] manager: (tap01da91c5-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.886 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.887 182096 INFO os_vif [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e')
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.922 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.922 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.922 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:fd:c0:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:46:13 compute-0 nova_compute[182092]: 2026-01-23 09:46:13.923 182096 INFO nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Using config drive
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.235 182096 INFO nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Creating config drive at /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.config
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.239 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfvet0cr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.359 182096 DEBUG oslo_concurrency.processutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfvet0cr" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:14 compute-0 kernel: tap01da91c5-3e: entered promiscuous mode
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.4148] manager: (tap01da91c5-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/364)
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.416 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 ovn_controller[94697]: 2026-01-23T09:46:14Z|00742|binding|INFO|Claiming lport 01da91c5-3e43-4507-80c4-3878ddbe17d2 for this chassis.
Jan 23 09:46:14 compute-0 ovn_controller[94697]: 2026-01-23T09:46:14Z|00743|binding|INFO|01da91c5-3e43-4507-80c4-3878ddbe17d2: Claiming fa:16:3e:fd:c0:07 10.100.0.14 2001:db8:0:1:f816:3eff:fefd:c007 2001:db8::f816:3eff:fefd:c007
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.420 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.422 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.4256] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.4271] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.430 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:c0:07 10.100.0.14 2001:db8:0:1:f816:3eff:fefd:c007 2001:db8::f816:3eff:fefd:c007'], port_security=['fa:16:3e:fd:c0:07 10.100.0.14 2001:db8:0:1:f816:3eff:fefd:c007 2001:db8::f816:3eff:fefd:c007'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8:0:1:f816:3eff:fefd:c007/64 2001:db8::f816:3eff:fefd:c007/64', 'neutron:device_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4795df8-adb2-4ea9-adaf-b20039ae5a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b15eafd7-caf2-4600-afb8-dc5a43f9194c, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=01da91c5-3e43-4507-80c4-3878ddbe17d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.431 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 01da91c5-3e43-4507-80c4-3878ddbe17d2 in datapath 136fe3f2-f077-4093-8452-cdc02e0c2016 bound to our chassis
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.432 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 136fe3f2-f077-4093-8452-cdc02e0c2016
Jan 23 09:46:14 compute-0 systemd-udevd[233348]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.442 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4998faf2-0631-445f-914a-41536ef26e1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.443 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap136fe3f2-f1 in ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.444 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap136fe3f2-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.444 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[013f56be-0f3d-4cd0-92cd-64f8ec000a5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 systemd-machined[153562]: New machine qemu-86-instance-000000b4.
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.445 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f5029a2f-5aae-43f4-b98f-ba73d0bf3cda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.4535] device (tap01da91c5-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.4543] device (tap01da91c5-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.457 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[e3df16ce-7e1c-476b-a5c2-d1ae6bf4ebf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 systemd[1]: Started Virtual Machine qemu-86-instance-000000b4.
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.486 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e95b27-1355-4bb0-9e9f-777ab29502c3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.511 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[506635ba-3b4c-44f0-8558-3415be7c9e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.5363] manager: (tap136fe3f2-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/367)
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.538 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.538 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a46ce4e2-167b-4fc6-933d-a8c17bf0c37c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.558 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 ovn_controller[94697]: 2026-01-23T09:46:14Z|00744|binding|INFO|Setting lport 01da91c5-3e43-4507-80c4-3878ddbe17d2 ovn-installed in OVS
Jan 23 09:46:14 compute-0 ovn_controller[94697]: 2026-01-23T09:46:14Z|00745|binding|INFO|Setting lport 01da91c5-3e43-4507-80c4-3878ddbe17d2 up in Southbound
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.570 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.572 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[860442c1-91e8-478c-99a3-a2c856f31ce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.574 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[9448c00a-777d-49fc-9216-3fd343442f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.5929] device (tap136fe3f2-f0): carrier: link connected
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.596 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[087d14b9-0c58-47e7-9dbd-07f7224d1d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.610 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9c0240-294b-4991-8ea0-c180a90572f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap136fe3f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:9e:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508007, 'reachable_time': 34039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233376, 'error': None, 'target': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.622 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0e9976-e627-4a08-968f-4234b49e9984]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:9e45'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508007, 'tstamp': 508007}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233377, 'error': None, 'target': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.635 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[27e7055c-bf36-4edb-b884-c67154156d8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap136fe3f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:9e:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508007, 'reachable_time': 34039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233378, 'error': None, 'target': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.659 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7e0b76-ce80-4c9e-ad54-72754be0e2b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.706 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d2addc94-a57a-4c4c-a259-e25ea366d48e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.707 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap136fe3f2-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.707 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.707 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap136fe3f2-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.709 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 kernel: tap136fe3f2-f0: entered promiscuous mode
Jan 23 09:46:14 compute-0 NetworkManager[54920]: <info>  [1769161574.7096] manager: (tap136fe3f2-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.713 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap136fe3f2-f0, col_values=(('external_ids', {'iface-id': '9f7f3dca-a3f6-4bfe-9810-13603b340b1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 ovn_controller[94697]: 2026-01-23T09:46:14Z|00746|binding|INFO|Releasing lport 9f7f3dca-a3f6-4bfe-9810-13603b340b1c from this chassis (sb_readonly=0)
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.717 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/136fe3f2-f077-4093-8452-cdc02e0c2016.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/136fe3f2-f077-4093-8452-cdc02e0c2016.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.717 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5abf4b-3c22-4edd-b192-ff28ad899009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.718 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-136fe3f2-f077-4093-8452-cdc02e0c2016
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/136fe3f2-f077-4093-8452-cdc02e0c2016.pid.haproxy
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 136fe3f2-f077-4093-8452-cdc02e0c2016
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:46:14 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:14.718 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'env', 'PROCESS_TAG=haproxy-136fe3f2-f077-4093-8452-cdc02e0c2016', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/136fe3f2-f077-4093-8452-cdc02e0c2016.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.726 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.822 182096 DEBUG nova.compute.manager [req-f781efa3-2f0e-4f23-890f-4a614db3ddd4 req-4cd01ed4-16fd-45ae-9f89-774fb314c11f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.828 182096 DEBUG oslo_concurrency.lockutils [req-f781efa3-2f0e-4f23-890f-4a614db3ddd4 req-4cd01ed4-16fd-45ae-9f89-774fb314c11f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.828 182096 DEBUG oslo_concurrency.lockutils [req-f781efa3-2f0e-4f23-890f-4a614db3ddd4 req-4cd01ed4-16fd-45ae-9f89-774fb314c11f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.829 182096 DEBUG oslo_concurrency.lockutils [req-f781efa3-2f0e-4f23-890f-4a614db3ddd4 req-4cd01ed4-16fd-45ae-9f89-774fb314c11f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:14 compute-0 nova_compute[182092]: 2026-01-23 09:46:14.829 182096 DEBUG nova.compute.manager [req-f781efa3-2f0e-4f23-890f-4a614db3ddd4 req-4cd01ed4-16fd-45ae-9f89-774fb314c11f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Processing event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:46:14 compute-0 podman[233408]: 2026-01-23 09:46:14.998283307 +0000 UTC m=+0.032135516 container create 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 09:46:15 compute-0 systemd[1]: Started libpod-conmon-077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56.scope.
Jan 23 09:46:15 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:46:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6a0808d0dce8bf896430a811aef7eff1ef94bf49cd28fe7be490c38b347359/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:46:15 compute-0 podman[233408]: 2026-01-23 09:46:15.060815891 +0000 UTC m=+0.094668110 container init 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 09:46:15 compute-0 podman[233408]: 2026-01-23 09:46:15.064839077 +0000 UTC m=+0.098691286 container start 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 09:46:15 compute-0 podman[233408]: 2026-01-23 09:46:14.983531702 +0000 UTC m=+0.017383921 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:46:15 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [NOTICE]   (233425) : New worker (233427) forked
Jan 23 09:46:15 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [NOTICE]   (233425) : Loading success.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.370 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161575.369608, 78ec13a1-543d-4ea8-8050-65fa6257d9fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.370 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] VM Started (Lifecycle Event)
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.372 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.376 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.379 182096 INFO nova.virt.libvirt.driver [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Instance spawned successfully.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.379 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.393 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.397 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.400 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.400 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.400 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.401 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.401 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.402 182096 DEBUG nova.virt.libvirt.driver [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.421 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.422 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161575.3703964, 78ec13a1-543d-4ea8-8050-65fa6257d9fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.422 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] VM Paused (Lifecycle Event)
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.438 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.440 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161575.3744059, 78ec13a1-543d-4ea8-8050-65fa6257d9fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.440 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] VM Resumed (Lifecycle Event)
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.451 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.453 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.459 182096 INFO nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Took 10.41 seconds to spawn the instance on the hypervisor.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.459 182096 DEBUG nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.467 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.528 182096 INFO nova.compute.manager [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Took 10.91 seconds to build instance.
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.548 182096 DEBUG oslo_concurrency.lockutils [None req-e31e22b4-605e-46b6-abc6-0a8fe9496c40 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.822 182096 DEBUG nova.network.neutron [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updated VIF entry in instance network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.823 182096 DEBUG nova.network.neutron [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:15 compute-0 nova_compute[182092]: 2026-01-23 09:46:15.840 182096 DEBUG oslo_concurrency.lockutils [req-b0dcb3b1-df01-479d-b658-d28ab50986f7 req-7753a73c-477d-44bb-a4a6-061d5622fb0b 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:46:16 compute-0 podman[233439]: 2026-01-23 09:46:16.209187677 +0000 UTC m=+0.041795161 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 09:46:16 compute-0 podman[233440]: 2026-01-23 09:46:16.218365823 +0000 UTC m=+0.048563113 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.898 182096 DEBUG nova.compute.manager [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.898 182096 DEBUG oslo_concurrency.lockutils [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.898 182096 DEBUG oslo_concurrency.lockutils [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.898 182096 DEBUG oslo_concurrency.lockutils [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.899 182096 DEBUG nova.compute.manager [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] No waiting events found dispatching network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:46:16 compute-0 nova_compute[182092]: 2026-01-23 09:46:16.899 182096 WARNING nova.compute.manager [req-03656102-d07d-4b5a-8b46-698d2be4c499 req-eb79235b-67bf-400f-992c-d1af0b1e5532 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received unexpected event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 for instance with vm_state active and task_state None.
Jan 23 09:46:17 compute-0 nova_compute[182092]: 2026-01-23 09:46:17.633 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:18 compute-0 nova_compute[182092]: 2026-01-23 09:46:18.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:18 compute-0 nova_compute[182092]: 2026-01-23 09:46:18.882 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:19 compute-0 podman[233477]: 2026-01-23 09:46:19.216394717 +0000 UTC m=+0.049499909 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:46:19 compute-0 nova_compute[182092]: 2026-01-23 09:46:19.468 182096 DEBUG nova.compute.manager [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:19 compute-0 nova_compute[182092]: 2026-01-23 09:46:19.469 182096 DEBUG nova.compute.manager [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing instance network info cache due to event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:46:19 compute-0 nova_compute[182092]: 2026-01-23 09:46:19.469 182096 DEBUG oslo_concurrency.lockutils [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:46:19 compute-0 nova_compute[182092]: 2026-01-23 09:46:19.470 182096 DEBUG oslo_concurrency.lockutils [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:46:19 compute-0 nova_compute[182092]: 2026-01-23 09:46:19.470 182096 DEBUG nova.network.neutron [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:46:20 compute-0 nova_compute[182092]: 2026-01-23 09:46:20.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:20 compute-0 nova_compute[182092]: 2026-01-23 09:46:20.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:21 compute-0 nova_compute[182092]: 2026-01-23 09:46:21.139 182096 DEBUG nova.network.neutron [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updated VIF entry in instance network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:46:21 compute-0 nova_compute[182092]: 2026-01-23 09:46:21.139 182096 DEBUG nova.network.neutron [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:21 compute-0 nova_compute[182092]: 2026-01-23 09:46:21.157 182096 DEBUG oslo_concurrency.lockutils [req-41248ef5-a827-4217-947e-4f8d01a338af req-ff506ac3-68d7-4390-90d2-ebe6eaf82ac9 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:46:21 compute-0 nova_compute[182092]: 2026-01-23 09:46:21.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.635 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.665 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.665 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.665 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.665 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.705 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.753 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.754 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:46:22 compute-0 nova_compute[182092]: 2026-01-23 09:46:22.811 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.028 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.029 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5559MB free_disk=73.2113151550293GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.029 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.030 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.082 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 78ec13a1-543d-4ea8-8050-65fa6257d9fb actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.082 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.083 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.113 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.122 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.139 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.139 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:23 compute-0 nova_compute[182092]: 2026-01-23 09:46:23.885 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.140 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.140 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.141 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.543 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.543 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.543 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:46:25 compute-0 nova_compute[182092]: 2026-01-23 09:46:25.544 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 78ec13a1-543d-4ea8-8050-65fa6257d9fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:46:26 compute-0 ovn_controller[94697]: 2026-01-23T09:46:26Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:c0:07 10.100.0.14
Jan 23 09:46:26 compute-0 ovn_controller[94697]: 2026-01-23T09:46:26Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:c0:07 10.100.0.14
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.551 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.565 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.566 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.566 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.637 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:27 compute-0 nova_compute[182092]: 2026-01-23 09:46:27.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:28 compute-0 nova_compute[182092]: 2026-01-23 09:46:28.888 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:30 compute-0 podman[233513]: 2026-01-23 09:46:30.217752454 +0000 UTC m=+0.055556550 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:46:30 compute-0 nova_compute[182092]: 2026-01-23 09:46:30.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:30 compute-0 nova_compute[182092]: 2026-01-23 09:46:30.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:46:32 compute-0 nova_compute[182092]: 2026-01-23 09:46:32.638 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.004 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'name': 'tempest-TestGettingAddress-server-1431643373', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b4', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd4181f6c647942e881af13381cc2f253', 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'hostId': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.024 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.024 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '051d0e6b-0889-47ec-93bf-bd5b029b14ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.005507', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6687a7a8-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '4bf04256e4c59cb1adbdebf77cd9575788dc5a32310f49fde0f423e87135cdce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 
'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.005507', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6687b5ea-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '371ee0d0ee19469fb0188244784cc59b8ed166a8411aaffd00f44094724109fe'}]}, 'timestamp': '2026-01-23 09:46:33.025208', '_unique_id': 'cc6d179452fe4090a5d1582ab93167ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.026 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>]
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>]
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>]
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.029 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 78ec13a1-543d-4ea8-8050-65fa6257d9fb / tap01da91c5-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.029 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6a7814a-3d69-49e1-b6f7-17026d231aa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.028320', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '66887b74-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': '9408351e4f8216485d2c499732af14acbb2dbe04c455f657a3f54c8cfdee8f38'}]}, 'timestamp': '2026-01-23 09:46:33.030238', '_unique_id': '308880c3a6064689947f73d3f58f230a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.030 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.031 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.bytes volume: 29993472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81200745-343e-496f-a53c-52b9ca3150a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29993472, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.031768', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6688c20a-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': 'b8006e2900d71f9c84c51cf2e47f2b3f39a124ce992c0560427eedc8d8d90d53'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.031768', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6688cb38-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '5aaa0c9e29bd83eeacd731e1d8b302b8e3d48117d5e9e6301b86247bef3ee8a4'}]}, 'timestamp': '2026-01-23 09:46:33.032264', '_unique_id': 'b69bfcc881d14b6a9c92594d89249320'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.032 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.033 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b41919c9-10ca-4a16-845d-cba4ed06f476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.033726', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668911e2-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': 'b4cbb77f57070bd0f3d8198f13f3465442755af9f0c2c30131d4e758838079b7'}]}, 'timestamp': '2026-01-23 09:46:33.034113', '_unique_id': '13ce52df95be4691a055daf503087661'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.034 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.035 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/cpu volume: 9970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f94cd8db-cab5-440a-b4f0-7e42ef48e0f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9970000000, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'timestamp': '2026-01-23T09:46:33.035547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '668af764-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.575569891, 'message_signature': 'e72e7214c66182c73d7d6d8f0ad67357df7f8a9a34462af8d97abbf39e2e7d37'}]}, 'timestamp': '2026-01-23 09:46:33.046511', '_unique_id': 'a39cd8d9e9a04ce88bc12e7ed0018af8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.046 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.054 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.054 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd79661fb-1982-48cb-8279-edcb2a80862a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.047941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668c3520-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': 'e39886f62e6007320749397535f0008fa8eb386d79e60eed396cd3a0ccfc9b2e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.047941', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668c3fe8-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': '922b6f2ed94dac9aabdde28d6232c8d2783f643589474a86d8348c607c6492c9'}]}, 'timestamp': '2026-01-23 09:46:33.054914', '_unique_id': 'a98e4c6fe0ad47d6b1ae35ef44232f6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.055 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.056 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.incoming.bytes volume: 1798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69ea187d-3b23-49bc-9958-4893d51b5d73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1798, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.056329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668c80e8-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': 'fa12eb3a293c41c1b8132a46fbdd834d3341f6e127723ed00350ff9d0e3d30da'}]}, 'timestamp': '2026-01-23 09:46:33.056590', '_unique_id': 'ec6f12d1a4004f54be0173ee308e68b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0a4da4a-f6f9-4396-8deb-c58b3970f34d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.057985', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668cc18e-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': '472d36af5d0d9733a8a63d9350182813574dbfe221c0386e3d0531a6a99a001f'}]}, 'timestamp': '2026-01-23 09:46:33.058244', '_unique_id': 'dc5f28fa28624ae0acaea12b221e6297'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.058 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.059 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91368c82-be70-4c4a-bebd-7b4b152e7c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.059613', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668d01ee-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': '78b566285b30ad4ec9ab52937602ab278f798d48a3a0c4aa6a56fb2dd0eff046'}]}, 'timestamp': '2026-01-23 09:46:33.059901', '_unique_id': '98b37417efb746da86a316fd25efbf53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.060 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.outgoing.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54daf852-95da-42a7-920a-ddf98a62d788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.061280', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668d421c-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': '2500ce66acc3d059ab6e3570433bb00c0e5d16d9cefa5a2f4dc908ad7aa50bd3'}]}, 'timestamp': '2026-01-23 09:46:33.061533', '_unique_id': '22d4f9ad9cbe47a399960ec7b38a4ab4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.061 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.062 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.requests volume: 1074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbed2547-f7f6-4797-95f0-77a793ed9803', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1074, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.062931', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668d82b8-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '09e8d1a86d344dcf343fbe58648c8d82ff883deebd469fc2ce378e705c46db2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.062931', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668d8bb4-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': 'ff8b180c03550b5bbb730ff96418c23edf009abeb76aae0eeaaabd790c2d3c77'}]}, 'timestamp': '2026-01-23 09:46:33.063407', '_unique_id': '2071f377409c4896a9b834de86d93f4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.063 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.064 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fc84d69-35b4-483e-a8ef-b1e43b6e3fce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.064813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668dcc32-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': '65a7a207e69ff23ab6dd1d4249031e967ab7f58878f93b99a85f1c567ad97f24'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 
'78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.064813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668dd52e-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': 'c5398ef5fb19166c16c8d96c918badb62ecab9aa7f74a669751a4baf5b7db6e2'}]}, 'timestamp': '2026-01-23 09:46:33.065287', '_unique_id': 'a9a0c4809a6240589dfdefff1438b83b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.065 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.066 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.outgoing.bytes volume: 2236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8236a71d-539a-4778-b468-129b2a3dbaca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2236, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.066697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668e16a6-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': 'a0acfe44afa57438750f539eb7ff9070ff2192d9c0c34dbe438f9132a2cbd991'}]}, 'timestamp': '2026-01-23 09:46:33.066974', '_unique_id': '1e9aeead74be41cf925772b91c83e25e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.067 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.068 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.068 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '229a8641-6a19-41fe-9bdc-c8e5cd7fb354', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'timestamp': '2026-01-23T09:46:33.068337', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '668e55e4-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.575569891, 'message_signature': '78a45d8e8fc244135ff9b77fe3031f3e293fca50ad73188d98fd2b512285a74c'}]}, 'timestamp': '2026-01-23 09:46:33.068588', '_unique_id': 'b5e4b13f7c7246218bb324cdb56953ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdd1ff6c-3414-4305-9714-3371eb7e1fcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.070004', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668e977a-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': '3acdef3c2d6e2ee644df0bb39eb6be1e447e0a46039aaf4686815cfcd8e21b9c'}]}, 'timestamp': '2026-01-23 09:46:33.070272', '_unique_id': 'bde830cfaa4d4678af322e02129ea4ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.070 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.071 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77c84484-4ced-46c8-828d-6fa850011e66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.071774', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668edc80-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': 'e71f234feb2839f67c98b05aec2431f96aec39a7db8d9929ea66d4ab0f71fb85'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.071774', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668ee57c-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '10ec2892fb29e4a627021579c064ad4113cfe889183ab29231dd61eed9c227bd'}]}, 'timestamp': '2026-01-23 09:46:33.072257', '_unique_id': 'e1e0596875f84bcbb8985aa2e6f9cbe1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.072 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.073 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.latency volume: 194402084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.073 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.read.latency volume: 114441552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82f363e4-d285-4f68-965e-a71ac755cd77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194402084, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.073643', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668f2654-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '76028cdb4a715447232583e3163ec6edf0e3b91afc9abbbf29a17928d45d0040'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 114441552, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.073643', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668f2f78-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '47ca80e5e0f4721efcedc43d4ad4998756816069faf26d9ffc657c3151c1dc2c'}]}, 'timestamp': '2026-01-23 09:46:33.074150', '_unique_id': 'e10a202ffa5d46e9813b3671e757b69a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.074 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.075 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4817339-bdbc-44a7-b5dd-3de085fda770', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.075549', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668f705a-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': 'bb97b4f0294eafeca69972ef5f9863101b805c76139cbdfe52db2c2398921a36'}]}, 'timestamp': '2026-01-23 09:46:33.075837', '_unique_id': 'ea73c6def812494c9b6f1052cf34442e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.076 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17c194ea-8cd6-452e-8f92-9660daf50da5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': 'instance-000000b4-78ec13a1-543d-4ea8-8050-65fa6257d9fb-tap01da91c5-3e', 'timestamp': '2026-01-23T09:46:33.077194', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'tap01da91c5-3e', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:c0:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap01da91c5-3e'}, 'message_id': '668fafca-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.557702495, 'message_signature': 'c0498c8d78af7cbabaf791aebd0b9c0bdfd72b554a2b6752e8451597a825e34d'}]}, 'timestamp': '2026-01-23 09:46:33.077446', '_unique_id': 'bca2e61532c641bb98f031246b0882c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.077 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.078 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.latency volume: 316247894 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da2ba371-804f-4e9f-a37b-992d9dc52b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 316247894, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.078842', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '668ff052-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': 'a0821bbd3b9f847027bb8ae02c81383f3608d73e4af2d462e679d41e0fe1216c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.078842', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '668ff944-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.534894947, 'message_signature': '0c22b58a6bd5a42dc696cf035f3f4c0bcf516ff54062858c72fe2026ddccaf0d'}]}, 'timestamp': '2026-01-23 09:46:33.079317', '_unique_id': 'd71219731b47458a877ad4b9b2a4d90e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.079 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.080 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.080 12 DEBUG ceilometer.compute.pollsters [-] 78ec13a1-543d-4ea8-8050-65fa6257d9fb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '954f6a13-947a-45eb-b1eb-92e5b1d87b40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-vda', 'timestamp': '2026-01-23T09:46:33.080731', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66903a76-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': 'a0db5ee0151c8fbb492b90dc2bdd7c7ee64453fddb8e1b3300cde007717fa0df'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_name': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_name': None, 'resource_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb-sda', 'timestamp': '2026-01-23T09:46:33.080731', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1431643373', 'name': 'instance-000000b4', 'instance_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'instance_type': 'm1.nano', 'host': 'c2b1674ac944b89c6c34ba426bbbe4ff7120696f4e6007c3012828f3', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '98e818ca-8ca1-4177-8a64-bde266c399d2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}, 'image_ref': '84bf9744-ebe0-4357-9697-347a3a1a297e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6690435e-f840-11f0-a9a3-fa163ea8bbc7', 'monotonic_time': 5098.577323146, 'message_signature': 'fa505b37625879c35efb8571a79cc2a787f30533ba36b1096b180d3dc1587568'}]}, 'timestamp': '2026-01-23 09:46:33.081213', '_unique_id': 'bb3e8e0eebf94ab98097ecd7887f09ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     yield
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.081 12 ERROR oslo_messaging.notify.messaging 
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 23 09:46:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:46:33.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1431643373>]
Jan 23 09:46:33 compute-0 nova_compute[182092]: 2026-01-23 09:46:33.891 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:37 compute-0 podman[233538]: 2026-01-23 09:46:37.207552397 +0000 UTC m=+0.041794264 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:46:37 compute-0 podman[233537]: 2026-01-23 09:46:37.212211531 +0000 UTC m=+0.048820121 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:46:37 compute-0 nova_compute[182092]: 2026-01-23 09:46:37.639 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:37 compute-0 nova_compute[182092]: 2026-01-23 09:46:37.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.006 182096 DEBUG nova.compute.manager [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.006 182096 DEBUG nova.compute.manager [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing instance network info cache due to event network-changed-01da91c5-3e43-4507-80c4-3878ddbe17d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.006 182096 DEBUG oslo_concurrency.lockutils [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.006 182096 DEBUG oslo_concurrency.lockutils [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.007 182096 DEBUG nova.network.neutron [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Refreshing network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.086 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.087 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.087 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.087 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.087 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.095 182096 INFO nova.compute.manager [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Terminating instance
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.101 182096 DEBUG nova.compute.manager [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:46:38 compute-0 kernel: tap01da91c5-3e (unregistering): left promiscuous mode
Jan 23 09:46:38 compute-0 NetworkManager[54920]: <info>  [1769161598.1249] device (tap01da91c5-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.133 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 ovn_controller[94697]: 2026-01-23T09:46:38Z|00747|binding|INFO|Releasing lport 01da91c5-3e43-4507-80c4-3878ddbe17d2 from this chassis (sb_readonly=0)
Jan 23 09:46:38 compute-0 ovn_controller[94697]: 2026-01-23T09:46:38Z|00748|binding|INFO|Setting lport 01da91c5-3e43-4507-80c4-3878ddbe17d2 down in Southbound
Jan 23 09:46:38 compute-0 ovn_controller[94697]: 2026-01-23T09:46:38Z|00749|binding|INFO|Removing iface tap01da91c5-3e ovn-installed in OVS
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.134 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.147 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:c0:07 10.100.0.14 2001:db8:0:1:f816:3eff:fefd:c007 2001:db8::f816:3eff:fefd:c007'], port_security=['fa:16:3e:fd:c0:07 10.100.0.14 2001:db8:0:1:f816:3eff:fefd:c007 2001:db8::f816:3eff:fefd:c007'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8:0:1:f816:3eff:fefd:c007/64 2001:db8::f816:3eff:fefd:c007/64', 'neutron:device_id': '78ec13a1-543d-4ea8-8050-65fa6257d9fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-136fe3f2-f077-4093-8452-cdc02e0c2016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4795df8-adb2-4ea9-adaf-b20039ae5a87', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b15eafd7-caf2-4600-afb8-dc5a43f9194c, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=01da91c5-3e43-4507-80c4-3878ddbe17d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.149 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 01da91c5-3e43-4507-80c4-3878ddbe17d2 in datapath 136fe3f2-f077-4093-8452-cdc02e0c2016 unbound from our chassis
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.150 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 136fe3f2-f077-4093-8452-cdc02e0c2016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.151 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.152 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[67f6c2ce-94a1-43d4-a4e5-422445bc7754]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.152 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016 namespace which is not needed anymore
Jan 23 09:46:38 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 23 09:46:38 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b4.scope: Consumed 11.680s CPU time.
Jan 23 09:46:38 compute-0 systemd-machined[153562]: Machine qemu-86-instance-000000b4 terminated.
Jan 23 09:46:38 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [NOTICE]   (233425) : haproxy version is 2.8.14-c23fe91
Jan 23 09:46:38 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [NOTICE]   (233425) : path to executable is /usr/sbin/haproxy
Jan 23 09:46:38 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [WARNING]  (233425) : Exiting Master process...
Jan 23 09:46:38 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [ALERT]    (233425) : Current worker (233427) exited with code 143 (Terminated)
Jan 23 09:46:38 compute-0 neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016[233421]: [WARNING]  (233425) : All workers exited. Exiting... (0)
Jan 23 09:46:38 compute-0 systemd[1]: libpod-077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56.scope: Deactivated successfully.
Jan 23 09:46:38 compute-0 podman[233595]: 2026-01-23 09:46:38.249348036 +0000 UTC m=+0.034175111 container died 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:46:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56-userdata-shm.mount: Deactivated successfully.
Jan 23 09:46:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f6a0808d0dce8bf896430a811aef7eff1ef94bf49cd28fe7be490c38b347359-merged.mount: Deactivated successfully.
Jan 23 09:46:38 compute-0 podman[233595]: 2026-01-23 09:46:38.27263626 +0000 UTC m=+0.057463325 container cleanup 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:46:38 compute-0 systemd[1]: libpod-conmon-077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56.scope: Deactivated successfully.
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.314 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 podman[233618]: 2026-01-23 09:46:38.318901664 +0000 UTC m=+0.029739048 container remove 077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.318 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.325 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[16600984-68e0-405d-9a77-1de5fcb795a7]: (4, ('Fri Jan 23 09:46:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016 (077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56)\n077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56\nFri Jan 23 09:46:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016 (077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56)\n077feb652ec3a97477af0f383a146024acd6baceadd416b93721a9dfda8a1d56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.326 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[53a522c1-3afb-4e8f-90f5-7a3e94fd2eea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.327 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap136fe3f2-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:38 compute-0 kernel: tap136fe3f2-f0: left promiscuous mode
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.330 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.345 182096 INFO nova.virt.libvirt.driver [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Instance destroyed successfully.
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.346 182096 DEBUG nova.objects.instance [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid 78ec13a1-543d-4ea8-8050-65fa6257d9fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.347 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[50e2bc0e-8c1f-4fe8-8675-2e26619a8849]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.358 182096 DEBUG nova.virt.libvirt.vif [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1431643373',display_name='tempest-TestGettingAddress-server-1431643373',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1431643373',id=180,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1WmpVwZgssx0dKhDaYyOZoXvrh//REcNU1I6fcdxkeIdHI6/R2eKf0MVdctH9uq6G3nY4ECs0iB1WYuY4olFiFPn/iZzzrfzhp9S8favg4NKplZPC7FtUDcxIHvirdfQ==',key_name='tempest-TestGettingAddress-1521133497',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:46:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-8zpw90zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:46:15Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=78ec13a1-543d-4ea8-8050-65fa6257d9fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.358 182096 DEBUG nova.network.os_vif_util [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.359 182096 DEBUG nova.network.os_vif_util [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.358 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3872a4-311c-4c1b-8d9a-bf5a17d2f253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.359 182096 DEBUG os_vif [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.359 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[71ded6f1-3a49-4dbe-9103-2a79e3d7eb49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.360 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.360 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01da91c5-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.363 182096 DEBUG nova.compute.manager [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-unplugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.363 182096 DEBUG oslo_concurrency.lockutils [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.363 182096 DEBUG oslo_concurrency.lockutils [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.364 182096 DEBUG oslo_concurrency.lockutils [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.364 182096 DEBUG nova.compute.manager [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] No waiting events found dispatching network-vif-unplugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.364 182096 DEBUG nova.compute.manager [req-3abcccb4-7a41-454b-8d08-cc017081bab8 req-60fb492e-0440-407f-bf7d-fc6b2eda54a4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-unplugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.364 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.365 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.366 182096 INFO os_vif [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:c0:07,bridge_name='br-int',has_traffic_filtering=True,id=01da91c5-3e43-4507-80c4-3878ddbe17d2,network=Network(136fe3f2-f077-4093-8452-cdc02e0c2016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01da91c5-3e')
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.367 182096 INFO nova.virt.libvirt.driver [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Deleting instance files /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb_del
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.367 182096 INFO nova.virt.libvirt.driver [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Deletion of /var/lib/nova/instances/78ec13a1-543d-4ea8-8050-65fa6257d9fb_del complete
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.417 182096 INFO nova.compute.manager [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Took 0.32 seconds to destroy the instance on the hypervisor.
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.418 182096 DEBUG oslo.service.loopingcall [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.418 182096 DEBUG nova.compute.manager [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:46:38 compute-0 nova_compute[182092]: 2026-01-23 09:46:38.418 182096 DEBUG nova.network.neutron [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.440 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[7477b2c5-750b-4ce1-9bca-f2e6349ddc3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507999, 'reachable_time': 32588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233649, 'error': None, 'target': 'ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d136fe3f2\x2df077\x2d4093\x2d8452\x2dcdc02e0c2016.mount: Deactivated successfully.
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.444 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-136fe3f2-f077-4093-8452-cdc02e0c2016 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:46:38 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:38.444 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[69fc722e-274e-4590-af07-8849de86fb3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.057 182096 DEBUG nova.network.neutron [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.069 182096 INFO nova.compute.manager [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Took 0.65 seconds to deallocate network for instance.
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.164 182096 DEBUG nova.compute.manager [req-78ead01e-61ca-44cb-be21-90200f4187f7 req-728f11a2-153d-4ead-ab78-4b5dcd70098a 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-deleted-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.166 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.166 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.210 182096 DEBUG nova.compute.provider_tree [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.222 182096 DEBUG nova.scheduler.client.report [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.240 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.260 182096 INFO nova.scheduler.client.report [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance 78ec13a1-543d-4ea8-8050-65fa6257d9fb
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.315 182096 DEBUG oslo_concurrency.lockutils [None req-2adcad7d-792e-42e5-8ef0-48c10dc73133 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.481 182096 DEBUG nova.network.neutron [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updated VIF entry in instance network info cache for port 01da91c5-3e43-4507-80c4-3878ddbe17d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.482 182096 DEBUG nova.network.neutron [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Updating instance_info_cache with network_info: [{"id": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "address": "fa:16:3e:fd:c0:07", "network": {"id": "136fe3f2-f077-4093-8452-cdc02e0c2016", "bridge": "br-int", "label": "tempest-network-smoke--1223250690", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefd:c007", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01da91c5-3e", "ovs_interfaceid": "01da91c5-3e43-4507-80c4-3878ddbe17d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:46:39 compute-0 nova_compute[182092]: 2026-01-23 09:46:39.511 182096 DEBUG oslo_concurrency.lockutils [req-03eae5d8-146f-4350-9d89-69b9c1aa63bc req-376f0f05-2fcd-46ab-afce-3ead60564e78 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-78ec13a1-543d-4ea8-8050-65fa6257d9fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:46:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:39.877 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:39.877 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:39.877 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.435 182096 DEBUG nova.compute.manager [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.436 182096 DEBUG oslo_concurrency.lockutils [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.436 182096 DEBUG oslo_concurrency.lockutils [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.436 182096 DEBUG oslo_concurrency.lockutils [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "78ec13a1-543d-4ea8-8050-65fa6257d9fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.437 182096 DEBUG nova.compute.manager [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] No waiting events found dispatching network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:46:40 compute-0 nova_compute[182092]: 2026-01-23 09:46:40.437 182096 WARNING nova.compute.manager [req-4ed715b8-1d2c-41f7-9279-8e5c9c22f4d3 req-b6cadc67-c2b5-4282-933b-6923fbf5e8e1 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Received unexpected event network-vif-plugged-01da91c5-3e43-4507-80c4-3878ddbe17d2 for instance with vm_state deleted and task_state None.
Jan 23 09:46:42 compute-0 nova_compute[182092]: 2026-01-23 09:46:42.642 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:43 compute-0 nova_compute[182092]: 2026-01-23 09:46:43.225 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:43 compute-0 nova_compute[182092]: 2026-01-23 09:46:43.345 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:43 compute-0 nova_compute[182092]: 2026-01-23 09:46:43.362 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:44 compute-0 nova_compute[182092]: 2026-01-23 09:46:44.012 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:44 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:44.011 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:46:44 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:44.012 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:46:47 compute-0 podman[233651]: 2026-01-23 09:46:47.209203396 +0000 UTC m=+0.038959429 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 09:46:47 compute-0 podman[233652]: 2026-01-23 09:46:47.212346392 +0000 UTC m=+0.040375269 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:46:47 compute-0 nova_compute[182092]: 2026-01-23 09:46:47.643 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:48 compute-0 nova_compute[182092]: 2026-01-23 09:46:48.362 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:50 compute-0 podman[233688]: 2026-01-23 09:46:50.198196926 +0000 UTC m=+0.036384164 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Jan 23 09:46:51 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:51.014 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:46:52 compute-0 nova_compute[182092]: 2026-01-23 09:46:52.646 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:53 compute-0 nova_compute[182092]: 2026-01-23 09:46:53.344 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161598.3439648, 78ec13a1-543d-4ea8-8050-65fa6257d9fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:46:53 compute-0 nova_compute[182092]: 2026-01-23 09:46:53.344 182096 INFO nova.compute.manager [-] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] VM Stopped (Lifecycle Event)
Jan 23 09:46:53 compute-0 nova_compute[182092]: 2026-01-23 09:46:53.363 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:53 compute-0 nova_compute[182092]: 2026-01-23 09:46:53.391 182096 DEBUG nova.compute.manager [None req-a573140e-c03d-4501-b900-b74e64f74e0d - - - - - -] [instance: 78ec13a1-543d-4ea8-8050-65fa6257d9fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:46:57 compute-0 nova_compute[182092]: 2026-01-23 09:46:57.646 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:46:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:58.278 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:88:b2 10.100.0.2 2001:db8::f816:3eff:fef2:88b2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef2:88b2/64', 'neutron:device_id': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=109fc9f6-a6c6-47a3-931c-e9c7ea41cfc2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53226a52-dc8a-495d-99f1-6e2094800533) old=Port_Binding(mac=['fa:16:3e:f2:88:b2 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:46:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:58.279 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53226a52-dc8a-495d-99f1-6e2094800533 in datapath 51a21ab3-c794-4eb4-8d73-d4bfcad68539 updated
Jan 23 09:46:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:58.280 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51a21ab3-c794-4eb4-8d73-d4bfcad68539, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:46:58 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:46:58.281 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[38aa444e-4992-40ec-aa81-0bfa7148e434]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:46:58 compute-0 nova_compute[182092]: 2026-01-23 09:46:58.365 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:01 compute-0 podman[233707]: 2026-01-23 09:47:01.215629338 +0000 UTC m=+0.053876391 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0)
Jan 23 09:47:02 compute-0 nova_compute[182092]: 2026-01-23 09:47:02.649 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:02.989 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:88:b2 10.100.0.2 2001:db8:0:1:f816:3eff:fef2:88b2 2001:db8::f816:3eff:fef2:88b2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fef2:88b2/64 2001:db8::f816:3eff:fef2:88b2/64', 'neutron:device_id': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=109fc9f6-a6c6-47a3-931c-e9c7ea41cfc2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53226a52-dc8a-495d-99f1-6e2094800533) old=Port_Binding(mac=['fa:16:3e:f2:88:b2 10.100.0.2 2001:db8::f816:3eff:fef2:88b2'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef2:88b2/64', 'neutron:device_id': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:47:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:02.990 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53226a52-dc8a-495d-99f1-6e2094800533 in datapath 51a21ab3-c794-4eb4-8d73-d4bfcad68539 updated
Jan 23 09:47:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:02.991 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51a21ab3-c794-4eb4-8d73-d4bfcad68539, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:47:02 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:02.992 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[39477716-77f1-4950-99ad-c2886d458b5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:03 compute-0 nova_compute[182092]: 2026-01-23 09:47:03.366 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:07 compute-0 nova_compute[182092]: 2026-01-23 09:47:07.649 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:08 compute-0 podman[233732]: 2026-01-23 09:47:08.2112544 +0000 UTC m=+0.038858820 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:47:08 compute-0 podman[233731]: 2026-01-23 09:47:08.212512021 +0000 UTC m=+0.042670507 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:47:08 compute-0 nova_compute[182092]: 2026-01-23 09:47:08.367 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:12 compute-0 nova_compute[182092]: 2026-01-23 09:47:12.650 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:13 compute-0 nova_compute[182092]: 2026-01-23 09:47:13.368 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:17 compute-0 nova_compute[182092]: 2026-01-23 09:47:17.652 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:18 compute-0 podman[233770]: 2026-01-23 09:47:18.204185501 +0000 UTC m=+0.036779902 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:47:18 compute-0 podman[233769]: 2026-01-23 09:47:18.227196844 +0000 UTC m=+0.062274024 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:47:18 compute-0 nova_compute[182092]: 2026-01-23 09:47:18.369 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:19 compute-0 nova_compute[182092]: 2026-01-23 09:47:19.659 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:21 compute-0 podman[233806]: 2026-01-23 09:47:21.20012331 +0000 UTC m=+0.038237427 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:47:21 compute-0 nova_compute[182092]: 2026-01-23 09:47:21.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:21 compute-0 nova_compute[182092]: 2026-01-23 09:47:21.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:22 compute-0 nova_compute[182092]: 2026-01-23 09:47:22.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:22 compute-0 nova_compute[182092]: 2026-01-23 09:47:22.653 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.370 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.671 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.671 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.866 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.866 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.21261596679688GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.867 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.867 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.928 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.928 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.964 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:47:23 compute-0 nova_compute[182092]: 2026-01-23 09:47:23.988 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:47:24 compute-0 nova_compute[182092]: 2026-01-23 09:47:24.006 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:47:24 compute-0 nova_compute[182092]: 2026-01-23 09:47:24.006 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:25 compute-0 ovn_controller[94697]: 2026-01-23T09:47:25Z|00750|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 23 09:47:26 compute-0 nova_compute[182092]: 2026-01-23 09:47:26.007 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:26 compute-0 nova_compute[182092]: 2026-01-23 09:47:26.007 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:47:26 compute-0 nova_compute[182092]: 2026-01-23 09:47:26.007 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:47:26 compute-0 nova_compute[182092]: 2026-01-23 09:47:26.025 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:47:27 compute-0 nova_compute[182092]: 2026-01-23 09:47:27.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:27 compute-0 nova_compute[182092]: 2026-01-23 09:47:27.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:27 compute-0 nova_compute[182092]: 2026-01-23 09:47:27.655 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:28 compute-0 nova_compute[182092]: 2026-01-23 09:47:28.372 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:30.190 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:47:30 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:30.191 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:47:30 compute-0 nova_compute[182092]: 2026-01-23 09:47:30.192 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:30 compute-0 nova_compute[182092]: 2026-01-23 09:47:30.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:30 compute-0 nova_compute[182092]: 2026-01-23 09:47:30.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:47:32 compute-0 podman[233826]: 2026-01-23 09:47:32.238159017 +0000 UTC m=+0.077526879 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:47:32 compute-0 nova_compute[182092]: 2026-01-23 09:47:32.656 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:33 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:33.192 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:33 compute-0 nova_compute[182092]: 2026-01-23 09:47:33.373 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:37 compute-0 nova_compute[182092]: 2026-01-23 09:47:37.657 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:38 compute-0 nova_compute[182092]: 2026-01-23 09:47:38.374 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:39 compute-0 podman[233850]: 2026-01-23 09:47:39.198881707 +0000 UTC m=+0.034646730 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:47:39 compute-0 podman[233849]: 2026-01-23 09:47:39.199848049 +0000 UTC m=+0.037325359 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:47:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:39.878 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.322 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.322 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.331 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.415 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.415 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.420 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.420 182096 INFO nova.compute.claims [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.523 182096 DEBUG nova.compute.provider_tree [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.535 182096 DEBUG nova.scheduler.client.report [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.560 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.560 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.624 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.624 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.639 182096 INFO nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.652 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.658 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.738 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.738 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.739 182096 INFO nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Creating image(s)
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.739 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.740 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.740 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.750 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.798 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.799 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.799 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.809 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.856 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.857 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.878 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.879 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.880 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.926 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.927 182096 DEBUG nova.virt.disk.api [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.927 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.947 182096 DEBUG nova.policy [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.973 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.973 182096 DEBUG nova.virt.disk.api [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.974 182096 DEBUG nova.objects.instance [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid 46b2697d-901b-419d-ba8b-4082a8dacef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.986 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.986 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Ensure instance console log exists: /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.987 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.987 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:42 compute-0 nova_compute[182092]: 2026-01-23 09:47:42.987 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:43 compute-0 nova_compute[182092]: 2026-01-23 09:47:43.375 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:43 compute-0 nova_compute[182092]: 2026-01-23 09:47:43.934 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Successfully created port: 3cdd3d90-ad23-43fb-acff-243955771b50 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.607 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Successfully updated port: 3cdd3d90-ad23-43fb-acff-243955771b50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.629 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.629 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.630 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.727 182096 DEBUG nova.compute.manager [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.728 182096 DEBUG nova.compute.manager [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing instance network info cache due to event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.728 182096 DEBUG oslo_concurrency.lockutils [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:47:44 compute-0 nova_compute[182092]: 2026-01-23 09:47:44.916 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.649 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.650 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.650 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.650 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.651 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.651 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.660 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.667 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.668 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Image id 84bf9744-ebe0-4357-9697-347a3a1a297e yields fingerprint 84599409cc5581bc3853b7dfcf4b30782ffdf147 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.668 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] image 84bf9744-ebe0-4357-9697-347a3a1a297e at (/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147): checking
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.668 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] image 84bf9744-ebe0-4357-9697-347a3a1a297e at (/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.669 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.670 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] 46b2697d-901b-419d-ba8b-4082a8dacef9 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.670 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] 46b2697d-901b-419d-ba8b-4082a8dacef9 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.670 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.714 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance 46b2697d-901b-419d-ba8b-4082a8dacef9 is backed by 84599409cc5581bc3853b7dfcf4b30782ffdf147 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 WARNING nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 WARNING nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 WARNING nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Active base files: /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.715 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Removable base files: /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89 /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.716 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/048a134f49b908e4f6e5b4543a869b574113fd89
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.716 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3ab4138fc997788c63c306bdd47c259649cf0f6c
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.716 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/9813441954c0e6188ad4957fcfa70eab6148e9cb
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.716 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.717 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.717 182096 DEBUG nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 23 09:47:45 compute-0 nova_compute[182092]: 2026-01-23 09:47:45.717 182096 INFO nova.virt.libvirt.imagecache [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.180 182096 DEBUG nova.network.neutron [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updating instance_info_cache with network_info: [{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.194 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.195 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Instance network_info: |[{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.195 182096 DEBUG oslo_concurrency.lockutils [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.195 182096 DEBUG nova.network.neutron [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.197 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Start _get_guest_xml network_info=[{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.200 182096 WARNING nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.207 182096 DEBUG nova.virt.libvirt.host [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.207 182096 DEBUG nova.virt.libvirt.host [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.210 182096 DEBUG nova.virt.libvirt.host [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.210 182096 DEBUG nova.virt.libvirt.host [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.211 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.211 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.212 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.212 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.212 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.212 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.212 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.213 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.213 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.213 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.213 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.213 182096 DEBUG nova.virt.hardware [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.216 182096 DEBUG nova.virt.libvirt.vif [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-128960503',display_name='tempest-TestGettingAddress-server-128960503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-128960503',id=184,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCkOps738Cbl+qKh9DJYOjz9dYsW32X9JFwZB7j+2+qfntsDw+vwN8d9MyopNTBWjZ3Vn12IydbvrfZ2XU//tnNjGMJ2/JRwuHcAgbnWa48XbXHMrusizoMhFMhFgnEHoA==',key_name='tempest-TestGettingAddress-263944147',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-rwvau6co',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:42Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=46b2697d-901b-419d-ba8b-4082a8dacef9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", 
"dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.216 182096 DEBUG nova.network.os_vif_util [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.217 182096 DEBUG nova.network.os_vif_util [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.218 182096 DEBUG nova.objects.instance [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46b2697d-901b-419d-ba8b-4082a8dacef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.229 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <uuid>46b2697d-901b-419d-ba8b-4082a8dacef9</uuid>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <name>instance-000000b8</name>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-128960503</nova:name>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:47:46</nova:creationTime>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         <nova:port uuid="3cdd3d90-ad23-43fb-acff-243955771b50">
Jan 23 09:47:46 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe8b:4555" ipVersion="6"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe8b:4555" ipVersion="6"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <system>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="serial">46b2697d-901b-419d-ba8b-4082a8dacef9</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="uuid">46b2697d-901b-419d-ba8b-4082a8dacef9</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </system>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <os>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </os>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <features>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </features>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.config"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:8b:45:55"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <target dev="tap3cdd3d90-ad"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/console.log" append="off"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <video>
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </video>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:47:46 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:47:46 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:47:46 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:47:46 compute-0 nova_compute[182092]: </domain>
Jan 23 09:47:46 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.230 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Preparing to wait for external event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.230 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.231 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.231 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.231 182096 DEBUG nova.virt.libvirt.vif [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-128960503',display_name='tempest-TestGettingAddress-server-128960503',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-128960503',id=184,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCkOps738Cbl+qKh9DJYOjz9dYsW32X9JFwZB7j+2+qfntsDw+vwN8d9MyopNTBWjZ3Vn12IydbvrfZ2XU//tnNjGMJ2/JRwuHcAgbnWa48XbXHMrusizoMhFMhFgnEHoA==',key_name='tempest-TestGettingAddress-263944147',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-rwvau6co',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:42Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=46b2697d-901b-419d-ba8b-4082a8dacef9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.232 182096 DEBUG nova.network.os_vif_util [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.232 182096 DEBUG nova.network.os_vif_util [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.232 182096 DEBUG os_vif [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.233 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.233 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.234 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.236 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.236 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cdd3d90-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.236 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cdd3d90-ad, col_values=(('external_ids', {'iface-id': '3cdd3d90-ad23-43fb-acff-243955771b50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:45:55', 'vm-uuid': '46b2697d-901b-419d-ba8b-4082a8dacef9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.2384] manager: (tap3cdd3d90-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.239 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.242 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.242 182096 INFO os_vif [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad')
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.269 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.270 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.270 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:8b:45:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.270 182096 INFO nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Using config drive
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.567 182096 INFO nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Creating config drive at /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.config
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.571 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1i2t83if execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.688 182096 DEBUG oslo_concurrency.processutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1i2t83if" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:47:46 compute-0 kernel: tap3cdd3d90-ad: entered promiscuous mode
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.7235] manager: (tap3cdd3d90-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Jan 23 09:47:46 compute-0 ovn_controller[94697]: 2026-01-23T09:47:46Z|00751|binding|INFO|Claiming lport 3cdd3d90-ad23-43fb-acff-243955771b50 for this chassis.
Jan 23 09:47:46 compute-0 ovn_controller[94697]: 2026-01-23T09:47:46Z|00752|binding|INFO|3cdd3d90-ad23-43fb-acff-243955771b50: Claiming fa:16:3e:8b:45:55 10.100.0.11 2001:db8:0:1:f816:3eff:fe8b:4555 2001:db8::f816:3eff:fe8b:4555
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.725 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.727 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.728 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.733 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.736 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.7380] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.7385] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.739 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:45:55 10.100.0.11 2001:db8:0:1:f816:3eff:fe8b:4555 2001:db8::f816:3eff:fe8b:4555'], port_security=['fa:16:3e:8b:45:55 10.100.0.11 2001:db8:0:1:f816:3eff:fe8b:4555 2001:db8::f816:3eff:fe8b:4555'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe8b:4555/64 2001:db8::f816:3eff:fe8b:4555/64', 'neutron:device_id': '46b2697d-901b-419d-ba8b-4082a8dacef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4f64f43c-b2a1-4d58-905b-776c7d1ea8bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=109fc9f6-a6c6-47a3-931c-e9c7ea41cfc2, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3cdd3d90-ad23-43fb-acff-243955771b50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.740 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3cdd3d90-ad23-43fb-acff-243955771b50 in datapath 51a21ab3-c794-4eb4-8d73-d4bfcad68539 bound to our chassis
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.741 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51a21ab3-c794-4eb4-8d73-d4bfcad68539
Jan 23 09:47:46 compute-0 systemd-machined[153562]: New machine qemu-87-instance-000000b8.
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.750 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e2c47f-0d42-4b3a-aa9c-eea8c34c7fa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.750 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51a21ab3-c1 in ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.751 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51a21ab3-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.751 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[21d86a2c-3343-4417-95bd-e3c1076c6731]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.752 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0bb74a-681f-4b21-aff2-11c0129b7acb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.759 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[ee36b132-ee0a-4b8d-b5d3-7fe492bbef4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 systemd[1]: Started Virtual Machine qemu-87-instance-000000b8.
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.779 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2b38c8-ba45-4bff-8051-556a304396ad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 systemd-udevd[233928]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.7947] device (tap3cdd3d90-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.7953] device (tap3cdd3d90-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.800 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6f86ba-c0d9-44cc-9d3f-fe36ed98bbd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.823 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0ca12f-ad15-4a9e-b445-4c06ae4b345a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.8244] manager: (tap51a21ab3-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/373)
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.845 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[0f399bdd-4a03-4f14-8a5f-dcdbb1b6f529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.847 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8af98a41-4083-41f3-a1a7-2ad0528debfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.858 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.8617] device (tap51a21ab3-c0): carrier: link connected
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.864 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2a7778-36ee-49b9-bba4-dee7eac86d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.871 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 ovn_controller[94697]: 2026-01-23T09:47:46Z|00753|binding|INFO|Setting lport 3cdd3d90-ad23-43fb-acff-243955771b50 ovn-installed in OVS
Jan 23 09:47:46 compute-0 ovn_controller[94697]: 2026-01-23T09:47:46Z|00754|binding|INFO|Setting lport 3cdd3d90-ad23-43fb-acff-243955771b50 up in Southbound
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.883 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.885 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25531401-3023-420c-8ac5-efacce0e659d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51a21ab3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:88:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517234, 'reachable_time': 35808, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233949, 'error': None, 'target': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.895 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[eb043b6f-b721-4fe6-b71b-3dd737706a02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:88b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517234, 'tstamp': 517234}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233950, 'error': None, 'target': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.905 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[10149577-b764-46ac-b7f4-2a3e3e5183ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51a21ab3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:88:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517234, 'reachable_time': 35808, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233951, 'error': None, 'target': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.922 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[155f9637-f090-43ca-a789-60b9f3f81112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.952 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bbac5419-4964-4a46-9a88-f6e8fd785e11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.953 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51a21ab3-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.953 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.954 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51a21ab3-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.955 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 NetworkManager[54920]: <info>  [1769161666.9559] manager: (tap51a21ab3-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 23 09:47:46 compute-0 kernel: tap51a21ab3-c0: entered promiscuous mode
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.957 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.959 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51a21ab3-c0, col_values=(('external_ids', {'iface-id': '53226a52-dc8a-495d-99f1-6e2094800533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.960 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 ovn_controller[94697]: 2026-01-23T09:47:46Z|00755|binding|INFO|Releasing lport 53226a52-dc8a-495d-99f1-6e2094800533 from this chassis (sb_readonly=0)
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.960 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.961 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51a21ab3-c794-4eb4-8d73-d4bfcad68539.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51a21ab3-c794-4eb4-8d73-d4bfcad68539.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.962 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d8afa4cd-8fee-4cac-8526-b243b4686583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.962 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-51a21ab3-c794-4eb4-8d73-d4bfcad68539
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/51a21ab3-c794-4eb4-8d73-d4bfcad68539.pid.haproxy
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 51a21ab3-c794-4eb4-8d73-d4bfcad68539
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:47:46 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:47:46.964 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'env', 'PROCESS_TAG=haproxy-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51a21ab3-c794-4eb4-8d73-d4bfcad68539.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:47:46 compute-0 nova_compute[182092]: 2026-01-23 09:47:46.972 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:47 compute-0 podman[233979]: 2026-01-23 09:47:47.233742981 +0000 UTC m=+0.029771098 container create beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 09:47:47 compute-0 systemd[1]: Started libpod-conmon-beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551.scope.
Jan 23 09:47:47 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:47:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b49b8eb24d1513ef80ae7bb86f7c7ade2fc666635483b21aab99e210b239277/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:47:47 compute-0 podman[233979]: 2026-01-23 09:47:47.291118479 +0000 UTC m=+0.087146596 container init beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 09:47:47 compute-0 podman[233979]: 2026-01-23 09:47:47.296081688 +0000 UTC m=+0.092109803 container start beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:47:47 compute-0 podman[233979]: 2026-01-23 09:47:47.220176775 +0000 UTC m=+0.016204912 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:47:47 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [NOTICE]   (233995) : New worker (233997) forked
Jan 23 09:47:47 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [NOTICE]   (233995) : Loading success.
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.391 182096 DEBUG nova.compute.manager [req-7931550a-d5f7-4c18-9840-663a8881190e req-265e72e5-054d-4c6d-b3ef-51dbb4b680d8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.392 182096 DEBUG oslo_concurrency.lockutils [req-7931550a-d5f7-4c18-9840-663a8881190e req-265e72e5-054d-4c6d-b3ef-51dbb4b680d8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.392 182096 DEBUG oslo_concurrency.lockutils [req-7931550a-d5f7-4c18-9840-663a8881190e req-265e72e5-054d-4c6d-b3ef-51dbb4b680d8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.392 182096 DEBUG oslo_concurrency.lockutils [req-7931550a-d5f7-4c18-9840-663a8881190e req-265e72e5-054d-4c6d-b3ef-51dbb4b680d8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.393 182096 DEBUG nova.compute.manager [req-7931550a-d5f7-4c18-9840-663a8881190e req-265e72e5-054d-4c6d-b3ef-51dbb4b680d8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Processing event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.659 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.842 182096 DEBUG nova.network.neutron [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updated VIF entry in instance network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.843 182096 DEBUG nova.network.neutron [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updating instance_info_cache with network_info: [{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:47:47 compute-0 nova_compute[182092]: 2026-01-23 09:47:47.857 182096 DEBUG oslo_concurrency.lockutils [req-011fa1db-760a-4fb5-8d2e-563fea7dacc1 req-216479ae-20e0-4770-9f35-01a1d944fbd4 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.469 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.601 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161668.6005428, 46b2697d-901b-419d-ba8b-4082a8dacef9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.601 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] VM Started (Lifecycle Event)
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.603 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.605 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.608 182096 INFO nova.virt.libvirt.driver [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Instance spawned successfully.
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.608 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.620 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.627 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.629 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.630 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.630 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.630 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.631 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.631 182096 DEBUG nova.virt.libvirt.driver [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.652 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.652 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161668.600727, 46b2697d-901b-419d-ba8b-4082a8dacef9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.652 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] VM Paused (Lifecycle Event)
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.680 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.682 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161668.6047626, 46b2697d-901b-419d-ba8b-4082a8dacef9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.682 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] VM Resumed (Lifecycle Event)
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.699 182096 INFO nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Took 5.96 seconds to spawn the instance on the hypervisor.
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.700 182096 DEBUG nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.704 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.706 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.730 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.781 182096 INFO nova.compute.manager [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Took 6.39 seconds to build instance.
Jan 23 09:47:48 compute-0 nova_compute[182092]: 2026-01-23 09:47:48.793 182096 DEBUG oslo_concurrency.lockutils [None req-17a000d1-773e-4e42-8cb0-d14fe721bef2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:49 compute-0 podman[234010]: 2026-01-23 09:47:49.22121102 +0000 UTC m=+0.051427797 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:47:49 compute-0 podman[234009]: 2026-01-23 09:47:49.243201729 +0000 UTC m=+0.075073024 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.715 182096 DEBUG nova.compute.manager [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.716 182096 DEBUG oslo_concurrency.lockutils [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.716 182096 DEBUG oslo_concurrency.lockutils [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.717 182096 DEBUG oslo_concurrency.lockutils [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.717 182096 DEBUG nova.compute.manager [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] No waiting events found dispatching network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:47:49 compute-0 nova_compute[182092]: 2026-01-23 09:47:49.717 182096 WARNING nova.compute.manager [req-22cba765-618f-4ca0-8e1c-746ec14538d8 req-c795516d-cdb5-4137-a0c3-bd9bc75f2244 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received unexpected event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 for instance with vm_state active and task_state None.
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.238 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.307 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.734 182096 DEBUG nova.compute.manager [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.734 182096 DEBUG nova.compute.manager [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing instance network info cache due to event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.734 182096 DEBUG oslo_concurrency.lockutils [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.735 182096 DEBUG oslo_concurrency.lockutils [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:47:51 compute-0 nova_compute[182092]: 2026-01-23 09:47:51.735 182096 DEBUG nova.network.neutron [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:47:52 compute-0 podman[234046]: 2026-01-23 09:47:52.197562605 +0000 UTC m=+0.040327048 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, 
container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 23 09:47:52 compute-0 nova_compute[182092]: 2026-01-23 09:47:52.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:53 compute-0 nova_compute[182092]: 2026-01-23 09:47:53.734 182096 DEBUG nova.network.neutron [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updated VIF entry in instance network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:47:53 compute-0 nova_compute[182092]: 2026-01-23 09:47:53.735 182096 DEBUG nova.network.neutron [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updating instance_info_cache with network_info: [{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:47:53 compute-0 nova_compute[182092]: 2026-01-23 09:47:53.753 182096 DEBUG oslo_concurrency.lockutils [req-cb489793-5eed-411a-a617-56352cb20651 req-f38a2ff7-b274-4b9e-9b88-55a5d5d8702c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:47:56 compute-0 nova_compute[182092]: 2026-01-23 09:47:56.240 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:47:57 compute-0 nova_compute[182092]: 2026-01-23 09:47:57.661 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:00 compute-0 ovn_controller[94697]: 2026-01-23T09:48:00Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:45:55 10.100.0.11
Jan 23 09:48:00 compute-0 ovn_controller[94697]: 2026-01-23T09:48:00Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:45:55 10.100.0.11
Jan 23 09:48:01 compute-0 nova_compute[182092]: 2026-01-23 09:48:01.242 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:02 compute-0 nova_compute[182092]: 2026-01-23 09:48:02.664 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:03 compute-0 podman[234076]: 2026-01-23 09:48:03.217307789 +0000 UTC m=+0.056600659 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, tcib_managed=true)
Jan 23 09:48:06 compute-0 nova_compute[182092]: 2026-01-23 09:48:06.244 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:07 compute-0 nova_compute[182092]: 2026-01-23 09:48:07.664 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.270 182096 DEBUG nova.compute.manager [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.271 182096 DEBUG nova.compute.manager [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing instance network info cache due to event network-changed-3cdd3d90-ad23-43fb-acff-243955771b50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.271 182096 DEBUG oslo_concurrency.lockutils [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.271 182096 DEBUG oslo_concurrency.lockutils [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.271 182096 DEBUG nova.network.neutron [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Refreshing network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.320 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.320 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.321 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.321 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.321 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.329 182096 INFO nova.compute.manager [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Terminating instance
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.334 182096 DEBUG nova.compute.manager [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:48:09 compute-0 kernel: tap3cdd3d90-ad (unregistering): left promiscuous mode
Jan 23 09:48:09 compute-0 NetworkManager[54920]: <info>  [1769161689.3616] device (tap3cdd3d90-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:48:09 compute-0 ovn_controller[94697]: 2026-01-23T09:48:09Z|00756|binding|INFO|Releasing lport 3cdd3d90-ad23-43fb-acff-243955771b50 from this chassis (sb_readonly=0)
Jan 23 09:48:09 compute-0 ovn_controller[94697]: 2026-01-23T09:48:09Z|00757|binding|INFO|Setting lport 3cdd3d90-ad23-43fb-acff-243955771b50 down in Southbound
Jan 23 09:48:09 compute-0 ovn_controller[94697]: 2026-01-23T09:48:09Z|00758|binding|INFO|Removing iface tap3cdd3d90-ad ovn-installed in OVS
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.379 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.381 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:45:55 10.100.0.11 2001:db8:0:1:f816:3eff:fe8b:4555 2001:db8::f816:3eff:fe8b:4555'], port_security=['fa:16:3e:8b:45:55 10.100.0.11 2001:db8:0:1:f816:3eff:fe8b:4555 2001:db8::f816:3eff:fe8b:4555'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe8b:4555/64 2001:db8::f816:3eff:fe8b:4555/64', 'neutron:device_id': '46b2697d-901b-419d-ba8b-4082a8dacef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4f64f43c-b2a1-4d58-905b-776c7d1ea8bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=109fc9f6-a6c6-47a3-931c-e9c7ea41cfc2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=3cdd3d90-ad23-43fb-acff-243955771b50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.382 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 3cdd3d90-ad23-43fb-acff-243955771b50 in datapath 51a21ab3-c794-4eb4-8d73-d4bfcad68539 unbound from our chassis
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.384 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51a21ab3-c794-4eb4-8d73-d4bfcad68539, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.385 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5adf84c8-c6ef-4b4e-b9f6-a6401bf76b17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.385 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539 namespace which is not needed anymore
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.390 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Jan 23 09:48:09 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b8.scope: Consumed 12.819s CPU time.
Jan 23 09:48:09 compute-0 systemd-machined[153562]: Machine qemu-87-instance-000000b8 terminated.
Jan 23 09:48:09 compute-0 podman[234102]: 2026-01-23 09:48:09.44027496 +0000 UTC m=+0.059258049 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:48:09 compute-0 podman[234099]: 2026-01-23 09:48:09.456494018 +0000 UTC m=+0.069018379 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:48:09 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [NOTICE]   (233995) : haproxy version is 2.8.14-c23fe91
Jan 23 09:48:09 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [NOTICE]   (233995) : path to executable is /usr/sbin/haproxy
Jan 23 09:48:09 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [WARNING]  (233995) : Exiting Master process...
Jan 23 09:48:09 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [ALERT]    (233995) : Current worker (233997) exited with code 143 (Terminated)
Jan 23 09:48:09 compute-0 neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539[233991]: [WARNING]  (233995) : All workers exited. Exiting... (0)
Jan 23 09:48:09 compute-0 systemd[1]: libpod-beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551.scope: Deactivated successfully.
Jan 23 09:48:09 compute-0 podman[234158]: 2026-01-23 09:48:09.49355106 +0000 UTC m=+0.035286335 container died beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:48:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551-userdata-shm.mount: Deactivated successfully.
Jan 23 09:48:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b49b8eb24d1513ef80ae7bb86f7c7ade2fc666635483b21aab99e210b239277-merged.mount: Deactivated successfully.
Jan 23 09:48:09 compute-0 podman[234158]: 2026-01-23 09:48:09.511344986 +0000 UTC m=+0.053080261 container cleanup beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:48:09 compute-0 systemd[1]: libpod-conmon-beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551.scope: Deactivated successfully.
Jan 23 09:48:09 compute-0 podman[234181]: 2026-01-23 09:48:09.566027869 +0000 UTC m=+0.038879780 container remove beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.569 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[25624447-c49c-4de9-b58d-fd6acc01889d]: (4, ('Fri Jan 23 09:48:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539 (beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551)\nbeed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551\nFri Jan 23 09:48:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539 (beed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551)\nbeed3e3f48a60623d05a4a7a1a04c9b1a35eb917937fb743da47a2ebf7512551\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.571 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[3151654a-5a7f-4dd3-a051-0073361deed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.572 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51a21ab3-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.574 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 kernel: tap51a21ab3-c0: left promiscuous mode
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.578 182096 INFO nova.virt.libvirt.driver [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Instance destroyed successfully.
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.578 182096 DEBUG nova.objects.instance [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid 46b2697d-901b-419d-ba8b-4082a8dacef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.590 182096 DEBUG nova.virt.libvirt.vif [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-128960503',display_name='tempest-TestGettingAddress-server-128960503',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-128960503',id=184,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCkOps738Cbl+qKh9DJYOjz9dYsW32X9JFwZB7j+2+qfntsDw+vwN8d9MyopNTBWjZ3Vn12IydbvrfZ2XU//tnNjGMJ2/JRwuHcAgbnWa48XbXHMrusizoMhFMhFgnEHoA==',key_name='tempest-TestGettingAddress-263944147',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-rwvau6co',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:48Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=46b2697d-901b-419d-ba8b-4082a8dacef9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.590 182096 DEBUG nova.network.os_vif_util [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.591 182096 DEBUG nova.network.os_vif_util [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.591 182096 DEBUG os_vif [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.592 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.592 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f1567721-891c-462a-b5e8-b7b22325b61d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.593 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cdd3d90-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.594 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.597 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.598 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.599 182096 INFO os_vif [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:45:55,bridge_name='br-int',has_traffic_filtering=True,id=3cdd3d90-ad23-43fb-acff-243955771b50,network=Network(51a21ab3-c794-4eb4-8d73-d4bfcad68539),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cdd3d90-ad')
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.600 182096 INFO nova.virt.libvirt.driver [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Deleting instance files /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9_del
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.600 182096 INFO nova.virt.libvirt.driver [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Deletion of /var/lib/nova/instances/46b2697d-901b-419d-ba8b-4082a8dacef9_del complete
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f63959b9-6f05-4a45-ba92-738fda6de9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.604 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1e1368-511e-45c3-8e80-926b9e41dfb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.617 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee406c6a-b2d3-4cd1-804d-191248731cf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517228, 'reachable_time': 39129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234212, 'error': None, 'target': 'ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d51a21ab3\x2dc794\x2d4eb4\x2d8d73\x2dd4bfcad68539.mount: Deactivated successfully.
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.620 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51a21ab3-c794-4eb4-8d73-d4bfcad68539 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:48:09 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:09.620 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[fb80d46d-af60-4567-a48d-439e05bace2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.662 182096 INFO nova.compute.manager [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.662 182096 DEBUG oslo.service.loopingcall [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.662 182096 DEBUG nova.compute.manager [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:48:09 compute-0 nova_compute[182092]: 2026-01-23 09:48:09.662 182096 DEBUG nova.network.neutron [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.096 182096 DEBUG nova.compute.manager [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-unplugged-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.096 182096 DEBUG oslo_concurrency.lockutils [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.096 182096 DEBUG oslo_concurrency.lockutils [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.096 182096 DEBUG oslo_concurrency.lockutils [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.097 182096 DEBUG nova.compute.manager [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] No waiting events found dispatching network-vif-unplugged-3cdd3d90-ad23-43fb-acff-243955771b50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.097 182096 DEBUG nova.compute.manager [req-603efeb6-c729-4315-b633-67c6d9f4b95f req-ca19a5d4-beb8-4e3a-8753-b9f9fa9dd03c 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-unplugged-3cdd3d90-ad23-43fb-acff-243955771b50 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:48:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:10.347 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.347 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:10 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:10.349 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.367 182096 DEBUG nova.network.neutron [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.381 182096 INFO nova.compute.manager [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Took 0.72 seconds to deallocate network for instance.
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.426 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.426 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.442 182096 DEBUG nova.compute.manager [req-8764ef23-e71c-4ad4-8a1a-6cbddab0751e req-d03dcc7c-f572-4a38-b5e3-ae16963d04a2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-deleted-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.478 182096 DEBUG nova.compute.provider_tree [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.494 182096 DEBUG nova.scheduler.client.report [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.513 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.532 182096 INFO nova.scheduler.client.report [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance 46b2697d-901b-419d-ba8b-4082a8dacef9
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.597 182096 DEBUG oslo_concurrency.lockutils [None req-3f6972f4-ae6c-4690-988d-b5971a44b7d2 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.947 182096 DEBUG nova.network.neutron [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updated VIF entry in instance network info cache for port 3cdd3d90-ad23-43fb-acff-243955771b50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.948 182096 DEBUG nova.network.neutron [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Updating instance_info_cache with network_info: [{"id": "3cdd3d90-ad23-43fb-acff-243955771b50", "address": "fa:16:3e:8b:45:55", "network": {"id": "51a21ab3-c794-4eb4-8d73-d4bfcad68539", "bridge": "br-int", "label": "tempest-network-smoke--113102690", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe8b:4555", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cdd3d90-ad", "ovs_interfaceid": "3cdd3d90-ad23-43fb-acff-243955771b50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:48:10 compute-0 nova_compute[182092]: 2026-01-23 09:48:10.967 182096 DEBUG oslo_concurrency.lockutils [req-c99dbadb-4800-438c-af09-df675e61b120 req-a3c119c9-1669-4f7a-94b1-d3b64f9f38d2 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-46b2697d-901b-419d-ba8b-4082a8dacef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.195 182096 DEBUG nova.compute.manager [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.195 182096 DEBUG oslo_concurrency.lockutils [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.195 182096 DEBUG oslo_concurrency.lockutils [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.195 182096 DEBUG oslo_concurrency.lockutils [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "46b2697d-901b-419d-ba8b-4082a8dacef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.196 182096 DEBUG nova.compute.manager [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] No waiting events found dispatching network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.196 182096 WARNING nova.compute.manager [req-3aea0b49-1f1d-43e2-8e36-6e43f58a155f req-bec1637b-1dba-429f-a3b2-4ae47e2008ee 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Received unexpected event network-vif-plugged-3cdd3d90-ad23-43fb-acff-243955771b50 for instance with vm_state deleted and task_state None.
Jan 23 09:48:12 compute-0 nova_compute[182092]: 2026-01-23 09:48:12.666 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:14 compute-0 nova_compute[182092]: 2026-01-23 09:48:14.594 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:15 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:15.350 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:17 compute-0 nova_compute[182092]: 2026-01-23 09:48:17.667 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:19 compute-0 nova_compute[182092]: 2026-01-23 09:48:19.595 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:20 compute-0 podman[234213]: 2026-01-23 09:48:20.208358498 +0000 UTC m=+0.040184350 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:48:20 compute-0 podman[234214]: 2026-01-23 09:48:20.239192669 +0000 UTC m=+0.069423984 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:48:20 compute-0 nova_compute[182092]: 2026-01-23 09:48:20.713 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:22 compute-0 nova_compute[182092]: 2026-01-23 09:48:22.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:22 compute-0 nova_compute[182092]: 2026-01-23 09:48:22.670 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:23 compute-0 podman[234249]: 2026-01-23 09:48:23.201369267 +0000 UTC m=+0.039700467 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 23 09:48:23 compute-0 nova_compute[182092]: 2026-01-23 09:48:23.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:24 compute-0 nova_compute[182092]: 2026-01-23 09:48:24.575 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161689.5741837, 46b2697d-901b-419d-ba8b-4082a8dacef9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:48:24 compute-0 nova_compute[182092]: 2026-01-23 09:48:24.575 182096 INFO nova.compute.manager [-] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] VM Stopped (Lifecycle Event)
Jan 23 09:48:24 compute-0 nova_compute[182092]: 2026-01-23 09:48:24.589 182096 DEBUG nova.compute.manager [None req-91acedba-5f25-464e-9dae-2b76dc4ff19d - - - - - -] [instance: 46b2697d-901b-419d-ba8b-4082a8dacef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:48:24 compute-0 nova_compute[182092]: 2026-01-23 09:48:24.596 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:24 compute-0 nova_compute[182092]: 2026-01-23 09:48:24.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.675 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.675 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.876 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.877 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.21207427978516GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.877 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.877 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:25 compute-0 nova_compute[182092]: 2026-01-23 09:48:25.977 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.041 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.041 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.103 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.137 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.156 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.156 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.172 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.189 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.203 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.216 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.237 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:48:26 compute-0 nova_compute[182092]: 2026-01-23 09:48:26.238 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:27 compute-0 nova_compute[182092]: 2026-01-23 09:48:27.671 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.238 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.238 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.238 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.253 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.253 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:28 compute-0 nova_compute[182092]: 2026-01-23 09:48:28.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:29 compute-0 nova_compute[182092]: 2026-01-23 09:48:29.597 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:30 compute-0 nova_compute[182092]: 2026-01-23 09:48:30.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:30 compute-0 nova_compute[182092]: 2026-01-23 09:48:30.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:48:32 compute-0 nova_compute[182092]: 2026-01-23 09:48:32.673 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:48:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:48:34 compute-0 podman[234269]: 2026-01-23 09:48:34.222294415 +0000 UTC m=+0.058565533 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:48:34 compute-0 nova_compute[182092]: 2026-01-23 09:48:34.598 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:34 compute-0 nova_compute[182092]: 2026-01-23 09:48:34.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:37 compute-0 nova_compute[182092]: 2026-01-23 09:48:37.674 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:38 compute-0 nova_compute[182092]: 2026-01-23 09:48:38.654 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:38 compute-0 nova_compute[182092]: 2026-01-23 09:48:38.666 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:38 compute-0 nova_compute[182092]: 2026-01-23 09:48:38.666 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.492 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:16:8a 10.100.0.2 2001:db8::f816:3eff:fe36:168a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe36:168a/64', 'neutron:device_id': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd5f8946-48e7-4f2c-8855-f64181aa7847, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4d28d5ae-45b7-4a14-8cda-7a830666433e) old=Port_Binding(mac=['fa:16:3e:36:16:8a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.493 103978 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4d28d5ae-45b7-4a14-8cda-7a830666433e in datapath 851738b3-cf3b-4d16-99d6-1b0160959d1e updated
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.494 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 851738b3-cf3b-4d16-99d6-1b0160959d1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.495 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3c6e36-9e22-4ceb-922d-7d8d842566fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:39 compute-0 nova_compute[182092]: 2026-01-23 09:48:39.599 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.879 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.879 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:40 compute-0 podman[234293]: 2026-01-23 09:48:40.221276682 +0000 UTC m=+0.044496028 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:48:40 compute-0 podman[234292]: 2026-01-23 09:48:40.249262804 +0000 UTC m=+0.074795182 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 09:48:42 compute-0 nova_compute[182092]: 2026-01-23 09:48:42.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:48:42 compute-0 nova_compute[182092]: 2026-01-23 09:48:42.658 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:48:42 compute-0 nova_compute[182092]: 2026-01-23 09:48:42.673 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:48:42 compute-0 nova_compute[182092]: 2026-01-23 09:48:42.675 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:44 compute-0 nova_compute[182092]: 2026-01-23 09:48:44.600 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.227 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.228 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.239 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.304 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.304 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.308 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.308 182096 INFO nova.compute.claims [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Claim successful on node compute-0.ctlplane.example.com
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.401 182096 DEBUG nova.compute.provider_tree [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.410 182096 DEBUG nova.scheduler.client.report [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.423 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.423 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.471 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.471 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.483 182096 INFO nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.496 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.577 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.578 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.578 182096 INFO nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Creating image(s)
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.579 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.579 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.580 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.590 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.637 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.638 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.639 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.648 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.694 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.695 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.719 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147,backing_fmt=raw /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.720 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "84599409cc5581bc3853b7dfcf4b30782ffdf147" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.720 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.766 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/84599409cc5581bc3853b7dfcf4b30782ffdf147 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.767 182096 DEBUG nova.virt.disk.api [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Checking if we can resize image /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.767 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.814 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.815 182096 DEBUG nova.virt.disk.api [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Cannot resize image /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.815 182096 DEBUG nova.objects.instance [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'migration_context' on Instance uuid a3fa434c-5abe-4d55-a14d-f540ec42a580 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.830 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.830 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Ensure instance console log exists: /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.831 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.831 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.831 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:45 compute-0 nova_compute[182092]: 2026-01-23 09:48:45.962 182096 DEBUG nova.policy [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2223cd913aab4f7cbffc6e9c703c6acc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4181f6c647942e881af13381cc2f253', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 09:48:46 compute-0 nova_compute[182092]: 2026-01-23 09:48:46.538 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Successfully created port: 4ba2d913-a539-4f82-8b0a-74163c704c0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.382 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Successfully updated port: 4ba2d913-a539-4f82-8b0a-74163c704c0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.395 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.396 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquired lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.396 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.459 182096 DEBUG nova.compute.manager [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.460 182096 DEBUG nova.compute.manager [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing instance network info cache due to event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.460 182096 DEBUG oslo_concurrency.lockutils [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.526 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 09:48:47 compute-0 nova_compute[182092]: 2026-01-23 09:48:47.677 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.642 182096 DEBUG nova.network.neutron [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.659 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Releasing lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.659 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Instance network_info: |[{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.659 182096 DEBUG oslo_concurrency.lockutils [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.659 182096 DEBUG nova.network.neutron [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.662 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Start _get_guest_xml network_info=[{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'encryption_options': None, 'encryption_format': None, 'image_id': '84bf9744-ebe0-4357-9697-347a3a1a297e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.666 182096 WARNING nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.673 182096 DEBUG nova.virt.libvirt.host [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.673 182096 DEBUG nova.virt.libvirt.host [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.676 182096 DEBUG nova.virt.libvirt.host [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.676 182096 DEBUG nova.virt.libvirt.host [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.677 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.678 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:11:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98e818ca-8ca1-4177-8a64-bde266c399d2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:11:55Z,direct_url=<?>,disk_format='qcow2',id=84bf9744-ebe0-4357-9697-347a3a1a297e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22ffac567c764637a2b549ff36a436c0',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:11:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.678 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.678 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.678 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.679 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.679 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.679 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.679 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.679 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.680 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.680 182096 DEBUG nova.virt.hardware [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.683 182096 DEBUG nova.virt.libvirt.vif [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-704129977',display_name='tempest-TestGettingAddress-server-704129977',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-704129977',id=186,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJgLJeFOzNTHEEI2Bw8lK4zjOz2xbO6XtGnUS/1veMSGVE5UBl5KMZE1BWI4iYOwtc9ZDRT1bam8Cj+8EfrZRDIHN76dap7/ylnixLNrBHMzKPZDOp36/7j4T+9G59cRTg==',key_name='tempest-TestGettingAddress-2014497851',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-07hff9g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:45Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a3fa434c-5abe-4d55-a14d-f540ec42a580,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.683 182096 DEBUG nova.network.os_vif_util [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.684 182096 DEBUG nova.network.os_vif_util [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.685 182096 DEBUG nova.objects.instance [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'pci_devices' on Instance uuid a3fa434c-5abe-4d55-a14d-f540ec42a580 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.695 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] End _get_guest_xml xml=<domain type="kvm">
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <uuid>a3fa434c-5abe-4d55-a14d-f540ec42a580</uuid>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <name>instance-000000ba</name>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <memory>131072</memory>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <vcpu>1</vcpu>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <metadata>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:name>tempest-TestGettingAddress-server-704129977</nova:name>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:creationTime>2026-01-23 09:48:48</nova:creationTime>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:flavor name="m1.nano">
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:memory>128</nova:memory>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:disk>1</nova:disk>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:swap>0</nova:swap>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:vcpus>1</nova:vcpus>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       </nova:flavor>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:owner>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:user uuid="2223cd913aab4f7cbffc6e9c703c6acc">tempest-TestGettingAddress-10741833-project-member</nova:user>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:project uuid="d4181f6c647942e881af13381cc2f253">tempest-TestGettingAddress-10741833</nova:project>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       </nova:owner>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:root type="image" uuid="84bf9744-ebe0-4357-9697-347a3a1a297e"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <nova:ports>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         <nova:port uuid="4ba2d913-a539-4f82-8b0a-74163c704c0d">
Jan 23 09:48:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fedb:5150" ipVersion="6"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:         </nova:port>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       </nova:ports>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </nova:instance>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </metadata>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <sysinfo type="smbios">
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <system>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="manufacturer">RDO</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="product">OpenStack Compute</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="serial">a3fa434c-5abe-4d55-a14d-f540ec42a580</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="uuid">a3fa434c-5abe-4d55-a14d-f540ec42a580</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <entry name="family">Virtual Machine</entry>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </system>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </sysinfo>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <os>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <boot dev="hd"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <smbios mode="sysinfo"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </os>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <features>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <acpi/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <apic/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <vmcoreinfo/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </features>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <clock offset="utc">
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <timer name="hpet" present="no"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </clock>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <cpu mode="custom" match="exact">
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <model>Nehalem</model>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </cpu>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   <devices>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <disk type="file" device="disk">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <target dev="vda" bus="virtio"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <disk type="file" device="cdrom">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <driver name="qemu" type="raw" cache="none"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <source file="/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.config"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <target dev="sda" bus="sata"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </disk>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <interface type="ethernet">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <mac address="fa:16:3e:db:51:50"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <mtu size="1442"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <target dev="tap4ba2d913-a5"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </interface>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <serial type="pty">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <log file="/var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/console.log" append="off"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </serial>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <video>
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <model type="virtio"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </video>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <input type="tablet" bus="usb"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <rng model="virtio">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <backend model="random">/dev/urandom</backend>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </rng>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <controller type="usb" index="0"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     <memballoon model="virtio">
Jan 23 09:48:48 compute-0 nova_compute[182092]:       <stats period="10"/>
Jan 23 09:48:48 compute-0 nova_compute[182092]:     </memballoon>
Jan 23 09:48:48 compute-0 nova_compute[182092]:   </devices>
Jan 23 09:48:48 compute-0 nova_compute[182092]: </domain>
Jan 23 09:48:48 compute-0 nova_compute[182092]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.696 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Preparing to wait for external event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.696 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.696 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.696 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.697 182096 DEBUG nova.virt.libvirt.vif [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-704129977',display_name='tempest-TestGettingAddress-server-704129977',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-704129977',id=186,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJgLJeFOzNTHEEI2Bw8lK4zjOz2xbO6XtGnUS/1veMSGVE5UBl5KMZE1BWI4iYOwtc9ZDRT1bam8Cj+8EfrZRDIHN76dap7/ylnixLNrBHMzKPZDOp36/7j4T+9G59cRTg==',key_name='tempest-TestGettingAddress-2014497851',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-07hff9g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:45Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a3fa434c-5abe-4d55-a14d-f540ec42a580,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.697 182096 DEBUG nova.network.os_vif_util [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.697 182096 DEBUG nova.network.os_vif_util [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.698 182096 DEBUG os_vif [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.698 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.698 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.699 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.700 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.701 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ba2d913-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.701 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ba2d913-a5, col_values=(('external_ids', {'iface-id': '4ba2d913-a539-4f82-8b0a-74163c704c0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:51:50', 'vm-uuid': 'a3fa434c-5abe-4d55-a14d-f540ec42a580'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:48 compute-0 NetworkManager[54920]: <info>  [1769161728.7031] manager: (tap4ba2d913-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.704 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.706 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.707 182096 INFO os_vif [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5')
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.758 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.758 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.759 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] No VIF found with MAC fa:16:3e:db:51:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 09:48:48 compute-0 nova_compute[182092]: 2026-01-23 09:48:48.759 182096 INFO nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Using config drive
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.068 182096 INFO nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Creating config drive at /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.config
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.072 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8n26m9p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.191 182096 DEBUG oslo_concurrency.processutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8n26m9p" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:48:49 compute-0 kernel: tap4ba2d913-a5: entered promiscuous mode
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.2284] manager: (tap4ba2d913-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/376)
Jan 23 09:48:49 compute-0 ovn_controller[94697]: 2026-01-23T09:48:49Z|00759|binding|INFO|Claiming lport 4ba2d913-a539-4f82-8b0a-74163c704c0d for this chassis.
Jan 23 09:48:49 compute-0 ovn_controller[94697]: 2026-01-23T09:48:49Z|00760|binding|INFO|4ba2d913-a539-4f82-8b0a-74163c704c0d: Claiming fa:16:3e:db:51:50 10.100.0.14 2001:db8::f816:3eff:fedb:5150
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.232 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.237 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.247 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:50 10.100.0.14 2001:db8::f816:3eff:fedb:5150'], port_security=['fa:16:3e:db:51:50 10.100.0.14 2001:db8::f816:3eff:fedb:5150'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fedb:5150/64', 'neutron:device_id': 'a3fa434c-5abe-4d55-a14d-f540ec42a580', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '2', 'neutron:security_group_ids': '792e64e4-b6fd-4169-bbdc-d65c28b4c5e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd5f8946-48e7-4f2c-8855-f64181aa7847, chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4ba2d913-a539-4f82-8b0a-74163c704c0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.248 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba2d913-a539-4f82-8b0a-74163c704c0d in datapath 851738b3-cf3b-4d16-99d6-1b0160959d1e bound to our chassis
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.249 103978 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 851738b3-cf3b-4d16-99d6-1b0160959d1e
Jan 23 09:48:49 compute-0 systemd-udevd[234365]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.259 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[e8951f46-7e5f-4d03-b521-9687efbaa7be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.260 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap851738b3-c1 in ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.261 210209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap851738b3-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.261 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[95cac75b-255f-48be-bcb3-3dc20ad7f8d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.262 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9e49b37f-124b-4185-af38-c4e8191c16dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.2682] device (tap4ba2d913-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.2689] device (tap4ba2d913-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 09:48:49 compute-0 systemd-machined[153562]: New machine qemu-88-instance-000000ba.
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.276 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[63e98e82-f764-48f6-a654-0444d9c90de3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.293 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[6230f181-b562-41b7-9ba6-5a28c9eb7384]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.294 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 systemd[1]: Started Virtual Machine qemu-88-instance-000000ba.
Jan 23 09:48:49 compute-0 ovn_controller[94697]: 2026-01-23T09:48:49Z|00761|binding|INFO|Setting lport 4ba2d913-a539-4f82-8b0a-74163c704c0d ovn-installed in OVS
Jan 23 09:48:49 compute-0 ovn_controller[94697]: 2026-01-23T09:48:49Z|00762|binding|INFO|Setting lport 4ba2d913-a539-4f82-8b0a-74163c704c0d up in Southbound
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.299 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.318 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[158124a2-4102-4285-bd34-c29ff3940cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 systemd-udevd[234369]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.3231] manager: (tap851738b3-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/377)
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.322 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[1f62847a-7889-4005-9dea-d31e86e4c675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.344 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3d5263-1944-4257-9243-9f191cfb9527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.346 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[4b89c78d-b3c0-4ab5-9360-cb85428bab29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.3640] device (tap851738b3-c0): carrier: link connected
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.368 210245 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd45429-00b3-429b-9b23-4d14aa2794e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.380 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[75a682ff-d1f9-417c-982e-a3043e96c15f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap851738b3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:16:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523484, 'reachable_time': 30951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234390, 'error': None, 'target': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.390 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[673d5034-3b19-4cde-b181-abc8e211713c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:168a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523484, 'tstamp': 523484}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234391, 'error': None, 'target': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.400 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[5d845d73-8aad-40a7-9640-ccf26bbd3276]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap851738b3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:16:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523484, 'reachable_time': 30951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234392, 'error': None, 'target': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.420 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[ee833b24-6c4c-40fe-a9dd-8661413a0df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.464 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[d8900116-efce-4244-ba85-f308bf452435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.465 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap851738b3-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.465 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.466 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap851738b3-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:49 compute-0 NetworkManager[54920]: <info>  [1769161729.4682] manager: (tap851738b3-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Jan 23 09:48:49 compute-0 kernel: tap851738b3-c0: entered promiscuous mode
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.470 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap851738b3-c0, col_values=(('external_ids', {'iface-id': '4d28d5ae-45b7-4a14-8cda-7a830666433e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:49 compute-0 ovn_controller[94697]: 2026-01-23T09:48:49Z|00763|binding|INFO|Releasing lport 4d28d5ae-45b7-4a14-8cda-7a830666433e from this chassis (sb_readonly=0)
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.472 103978 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/851738b3-cf3b-4d16-99d6-1b0160959d1e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/851738b3-cf3b-4d16-99d6-1b0160959d1e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.483 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.484 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b38e2d33-91f7-43bc-84da-899a8da8f173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.485 103978 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: global
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     log         /dev/log local0 debug
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     log-tag     haproxy-metadata-proxy-851738b3-cf3b-4d16-99d6-1b0160959d1e
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     user        root
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     group       root
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     maxconn     1024
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     pidfile     /var/lib/neutron/external/pids/851738b3-cf3b-4d16-99d6-1b0160959d1e.pid.haproxy
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     daemon
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: defaults
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     log global
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     mode http
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     option httplog
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     option dontlognull
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     option http-server-close
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     option forwardfor
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     retries                 3
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     timeout http-request    30s
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     timeout connect         30s
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     timeout client          32s
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     timeout server          32s
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     timeout http-keep-alive 30s
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: listen listener
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     bind 169.254.169.254:80
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:     http-request add-header X-OVN-Network-ID 851738b3-cf3b-4d16-99d6-1b0160959d1e
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.486 103978 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'env', 'PROCESS_TAG=haproxy-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/851738b3-cf3b-4d16-99d6-1b0160959d1e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.487 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.510 182096 DEBUG nova.compute.manager [req-3b312c54-a35a-4369-8a45-59e0459767e6 req-2fad372f-c8fc-459f-a82a-31fc7bcf4465 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.510 182096 DEBUG oslo_concurrency.lockutils [req-3b312c54-a35a-4369-8a45-59e0459767e6 req-2fad372f-c8fc-459f-a82a-31fc7bcf4465 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.510 182096 DEBUG oslo_concurrency.lockutils [req-3b312c54-a35a-4369-8a45-59e0459767e6 req-2fad372f-c8fc-459f-a82a-31fc7bcf4465 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.510 182096 DEBUG oslo_concurrency.lockutils [req-3b312c54-a35a-4369-8a45-59e0459767e6 req-2fad372f-c8fc-459f-a82a-31fc7bcf4465 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.511 182096 DEBUG nova.compute.manager [req-3b312c54-a35a-4369-8a45-59e0459767e6 req-2fad372f-c8fc-459f-a82a-31fc7bcf4465 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Processing event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.577 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:48:49 compute-0 podman[234420]: 2026-01-23 09:48:49.76971396 +0000 UTC m=+0.031634900 container create 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 09:48:49 compute-0 systemd[1]: Started libpod-conmon-789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8.scope.
Jan 23 09:48:49 compute-0 systemd[1]: Started libcrun container.
Jan 23 09:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb4f8907eff4449abe58895372ebb207ffdb5e8a5d7ac78ce037314098474e6f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 09:48:49 compute-0 podman[234420]: 2026-01-23 09:48:49.825493469 +0000 UTC m=+0.087414431 container init 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:48:49 compute-0 podman[234420]: 2026-01-23 09:48:49.831185199 +0000 UTC m=+0.093106140 container start 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 09:48:49 compute-0 podman[234420]: 2026-01-23 09:48:49.754781939 +0000 UTC m=+0.016702900 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 09:48:49 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [NOTICE]   (234436) : New worker (234438) forked
Jan 23 09:48:49 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [NOTICE]   (234436) : Loading success.
Jan 23 09:48:49 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:49.869 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.966 182096 DEBUG nova.network.neutron [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updated VIF entry in instance network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.967 182096 DEBUG nova.network.neutron [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:48:49 compute-0 nova_compute[182092]: 2026-01-23 09:48:49.984 182096 DEBUG oslo_concurrency.lockutils [req-b749fe88-0c8e-4e49-b8c3-f3c39b180a32 req-ced9e439-e13f-4252-807e-75078a72fb2d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.283 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161730.283134, a3fa434c-5abe-4d55-a14d-f540ec42a580 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.283 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] VM Started (Lifecycle Event)
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.285 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.292 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.295 182096 INFO nova.virt.libvirt.driver [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Instance spawned successfully.
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.295 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.302 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.304 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.312 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.312 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.313 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.313 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.313 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.313 182096 DEBUG nova.virt.libvirt.driver [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.316 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.316 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161730.2850788, a3fa434c-5abe-4d55-a14d-f540ec42a580 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.317 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] VM Paused (Lifecycle Event)
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.344 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.346 182096 DEBUG nova.virt.driver [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] Emitting event <LifecycleEvent: 1769161730.2870154, a3fa434c-5abe-4d55-a14d-f540ec42a580 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.346 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] VM Resumed (Lifecycle Event)
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.361 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.363 182096 DEBUG nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.375 182096 INFO nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Took 4.80 seconds to spawn the instance on the hypervisor.
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.375 182096 DEBUG nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.379 182096 INFO nova.compute.manager [None req-b5bbd20e-6139-4d20-a30c-ccf970702789 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.425 182096 INFO nova.compute.manager [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Took 5.15 seconds to build instance.
Jan 23 09:48:50 compute-0 nova_compute[182092]: 2026-01-23 09:48:50.434 182096 DEBUG oslo_concurrency.lockutils [None req-e11cd57b-69e5-4f2f-a908-aa09f10117e1 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:51 compute-0 podman[234450]: 2026-01-23 09:48:51.215239901 +0000 UTC m=+0.049293964 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:48:51 compute-0 podman[234451]: 2026-01-23 09:48:51.215272762 +0000 UTC m=+0.047678748 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.578 182096 DEBUG nova.compute.manager [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.579 182096 DEBUG oslo_concurrency.lockutils [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.579 182096 DEBUG oslo_concurrency.lockutils [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.579 182096 DEBUG oslo_concurrency.lockutils [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.579 182096 DEBUG nova.compute.manager [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] No waiting events found dispatching network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:48:51 compute-0 nova_compute[182092]: 2026-01-23 09:48:51.579 182096 WARNING nova.compute.manager [req-8ac0ff6d-9d28-42d7-9698-0c17eb65f962 req-9968ab3f-661b-4ef8-84e8-6ed69d7307e8 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received unexpected event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d for instance with vm_state active and task_state None.
Jan 23 09:48:52 compute-0 nova_compute[182092]: 2026-01-23 09:48:52.678 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:53 compute-0 nova_compute[182092]: 2026-01-23 09:48:53.703 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:54 compute-0 NetworkManager[54920]: <info>  [1769161734.0935] manager: (patch-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.092 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:54 compute-0 NetworkManager[54920]: <info>  [1769161734.0944] manager: (patch-br-int-to-provnet-b70ddb5c-fd34-4dbc-9836-a1de220595f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.213 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:54 compute-0 ovn_controller[94697]: 2026-01-23T09:48:54Z|00764|binding|INFO|Releasing lport 4d28d5ae-45b7-4a14-8cda-7a830666433e from this chassis (sb_readonly=0)
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.223 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:54 compute-0 podman[234488]: 2026-01-23 09:48:54.26689355 +0000 UTC m=+0.041938457 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter)
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.346 182096 DEBUG nova.compute.manager [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.346 182096 DEBUG nova.compute.manager [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing instance network info cache due to event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.347 182096 DEBUG oslo_concurrency.lockutils [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.347 182096 DEBUG oslo_concurrency.lockutils [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:48:54 compute-0 nova_compute[182092]: 2026-01-23 09:48:54.347 182096 DEBUG nova.network.neutron [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:48:55 compute-0 nova_compute[182092]: 2026-01-23 09:48:55.980 182096 DEBUG nova.network.neutron [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updated VIF entry in instance network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:48:55 compute-0 nova_compute[182092]: 2026-01-23 09:48:55.981 182096 DEBUG nova.network.neutron [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:48:55 compute-0 nova_compute[182092]: 2026-01-23 09:48:55.996 182096 DEBUG oslo_concurrency.lockutils [req-619d467c-1e8a-4fc8-a3a2-9c186092f1f4 req-59a491ac-1d4c-4bea-8095-883931212397 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:48:57 compute-0 nova_compute[182092]: 2026-01-23 09:48:57.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:48:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:48:57.871 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:48:58 compute-0 nova_compute[182092]: 2026-01-23 09:48:58.705 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:02 compute-0 ovn_controller[94697]: 2026-01-23T09:49:02Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:51:50 10.100.0.14
Jan 23 09:49:02 compute-0 ovn_controller[94697]: 2026-01-23T09:49:02Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:51:50 10.100.0.14
Jan 23 09:49:02 compute-0 nova_compute[182092]: 2026-01-23 09:49:02.682 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:03 compute-0 nova_compute[182092]: 2026-01-23 09:49:03.706 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:04 compute-0 ovn_controller[94697]: 2026-01-23T09:49:04Z|00765|binding|INFO|Releasing lport 4d28d5ae-45b7-4a14-8cda-7a830666433e from this chassis (sb_readonly=0)
Jan 23 09:49:04 compute-0 nova_compute[182092]: 2026-01-23 09:49:04.414 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:05 compute-0 podman[234518]: 2026-01-23 09:49:05.228327872 +0000 UTC m=+0.060973422 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:49:07 compute-0 ovn_controller[94697]: 2026-01-23T09:49:07Z|00766|binding|INFO|Releasing lport 4d28d5ae-45b7-4a14-8cda-7a830666433e from this chassis (sb_readonly=0)
Jan 23 09:49:07 compute-0 nova_compute[182092]: 2026-01-23 09:49:07.390 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:07 compute-0 nova_compute[182092]: 2026-01-23 09:49:07.683 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:08 compute-0 nova_compute[182092]: 2026-01-23 09:49:08.709 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:11 compute-0 podman[234542]: 2026-01-23 09:49:11.205205637 +0000 UTC m=+0.038836788 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:49:11 compute-0 podman[234541]: 2026-01-23 09:49:11.207793416 +0000 UTC m=+0.044604562 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:49:12 compute-0 nova_compute[182092]: 2026-01-23 09:49:12.686 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:13 compute-0 nova_compute[182092]: 2026-01-23 09:49:13.712 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:16 compute-0 nova_compute[182092]: 2026-01-23 09:49:16.094 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:16 compute-0 nova_compute[182092]: 2026-01-23 09:49:16.113 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Triggering sync for uuid a3fa434c-5abe-4d55-a14d-f540ec42a580 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 09:49:16 compute-0 nova_compute[182092]: 2026-01-23 09:49:16.114 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:16 compute-0 nova_compute[182092]: 2026-01-23 09:49:16.114 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:16 compute-0 nova_compute[182092]: 2026-01-23 09:49:16.134 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:17 compute-0 nova_compute[182092]: 2026-01-23 09:49:17.687 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:18 compute-0 nova_compute[182092]: 2026-01-23 09:49:18.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:22 compute-0 podman[234580]: 2026-01-23 09:49:22.201415962 +0000 UTC m=+0.039242644 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:49:22 compute-0 podman[234581]: 2026-01-23 09:49:22.20641689 +0000 UTC m=+0.041171229 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:49:22 compute-0 nova_compute[182092]: 2026-01-23 09:49:22.665 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:22 compute-0 nova_compute[182092]: 2026-01-23 09:49:22.689 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:23 compute-0 nova_compute[182092]: 2026-01-23 09:49:23.717 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:24 compute-0 nova_compute[182092]: 2026-01-23 09:49:24.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:24 compute-0 nova_compute[182092]: 2026-01-23 09:49:24.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:25 compute-0 podman[234618]: 2026-01-23 09:49:25.201158525 +0000 UTC m=+0.039464572 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.670 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.713 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.760 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.760 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 09:49:25 compute-0 nova_compute[182092]: 2026-01-23 09:49:25.806 182096 DEBUG oslo_concurrency.processutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.020 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.021 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5566MB free_disk=73.18291854858398GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.021 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.022 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.075 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Instance a3fa434c-5abe-4d55-a14d-f540ec42a580 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.076 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.076 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.107 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.118 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.132 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:49:26 compute-0 nova_compute[182092]: 2026-01-23 09:49:26.133 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.133 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.691 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.959 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.960 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquired lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.960 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 09:49:27 compute-0 nova_compute[182092]: 2026-01-23 09:49:27.960 182096 DEBUG nova.objects.instance [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a3fa434c-5abe-4d55-a14d-f540ec42a580 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.719 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.977 182096 DEBUG nova.network.neutron [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.990 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Releasing lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.991 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.991 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:28 compute-0 nova_compute[182092]: 2026-01-23 09:49:28.991 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:30 compute-0 nova_compute[182092]: 2026-01-23 09:49:30.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:49:30 compute-0 nova_compute[182092]: 2026-01-23 09:49:30.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:49:32 compute-0 nova_compute[182092]: 2026-01-23 09:49:32.692 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:33 compute-0 nova_compute[182092]: 2026-01-23 09:49:33.720 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:36 compute-0 podman[234644]: 2026-01-23 09:49:36.221615641 +0000 UTC m=+0.058935369 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:49:37 compute-0 nova_compute[182092]: 2026-01-23 09:49:37.693 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:38 compute-0 nova_compute[182092]: 2026-01-23 09:49:38.722 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:42 compute-0 podman[234668]: 2026-01-23 09:49:42.201193357 +0000 UTC m=+0.039008562 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 23 09:49:42 compute-0 podman[234669]: 2026-01-23 09:49:42.206028142 +0000 UTC m=+0.041452540 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:49:42 compute-0 nova_compute[182092]: 2026-01-23 09:49:42.695 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:43 compute-0 nova_compute[182092]: 2026-01-23 09:49:43.724 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:47 compute-0 nova_compute[182092]: 2026-01-23 09:49:47.696 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:48 compute-0 nova_compute[182092]: 2026-01-23 09:49:48.727 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:50 compute-0 ovn_controller[94697]: 2026-01-23T09:49:50Z|00767|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 09:49:52 compute-0 nova_compute[182092]: 2026-01-23 09:49:52.697 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:53 compute-0 nova_compute[182092]: 2026-01-23 09:49:53.185 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:53.184 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:49:53 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:53.185 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:49:53 compute-0 podman[234707]: 2026-01-23 09:49:53.207193868 +0000 UTC m=+0.041336071 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:49:53 compute-0 podman[234708]: 2026-01-23 09:49:53.207947589 +0000 UTC m=+0.040138492 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:49:53 compute-0 nova_compute[182092]: 2026-01-23 09:49:53.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 podman[234745]: 2026-01-23 09:49:56.206266507 +0000 UTC m=+0.038828785 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.209 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.210 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.210 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.210 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.210 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.217 182096 INFO nova.compute.manager [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Terminating instance
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.219 182096 DEBUG nova.compute.manager [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.219 182096 DEBUG nova.compute.manager [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing instance network info cache due to event network-changed-4ba2d913-a539-4f82-8b0a-74163c704c0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.220 182096 DEBUG oslo_concurrency.lockutils [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.220 182096 DEBUG oslo_concurrency.lockutils [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquired lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.220 182096 DEBUG nova.network.neutron [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Refreshing network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.225 182096 DEBUG nova.compute.manager [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 09:49:56 compute-0 kernel: tap4ba2d913-a5 (unregistering): left promiscuous mode
Jan 23 09:49:56 compute-0 NetworkManager[54920]: <info>  [1769161796.2548] device (tap4ba2d913-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.259 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 ovn_controller[94697]: 2026-01-23T09:49:56Z|00768|binding|INFO|Releasing lport 4ba2d913-a539-4f82-8b0a-74163c704c0d from this chassis (sb_readonly=0)
Jan 23 09:49:56 compute-0 ovn_controller[94697]: 2026-01-23T09:49:56Z|00769|binding|INFO|Setting lport 4ba2d913-a539-4f82-8b0a-74163c704c0d down in Southbound
Jan 23 09:49:56 compute-0 ovn_controller[94697]: 2026-01-23T09:49:56Z|00770|binding|INFO|Removing iface tap4ba2d913-a5 ovn-installed in OVS
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.262 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.265 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:51:50 10.100.0.14 2001:db8::f816:3eff:fedb:5150'], port_security=['fa:16:3e:db:51:50 10.100.0.14 2001:db8::f816:3eff:fedb:5150'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fedb:5150/64', 'neutron:device_id': 'a3fa434c-5abe-4d55-a14d-f540ec42a580', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4181f6c647942e881af13381cc2f253', 'neutron:revision_number': '4', 'neutron:security_group_ids': '792e64e4-b6fd-4169-bbdc-d65c28b4c5e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd5f8946-48e7-4f2c-8855-f64181aa7847, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>], logical_port=4ba2d913-a539-4f82-8b0a-74163c704c0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f363770ea00>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.266 103978 INFO neutron.agent.ovn.metadata.agent [-] Port 4ba2d913-a539-4f82-8b0a-74163c704c0d in datapath 851738b3-cf3b-4d16-99d6-1b0160959d1e unbound from our chassis
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.267 103978 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 851738b3-cf3b-4d16-99d6-1b0160959d1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.268 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e0b223-d6a6-48f2-8869-813df0324dc5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.269 103978 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e namespace which is not needed anymore
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.278 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 23 09:49:56 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000ba.scope: Consumed 12.869s CPU time.
Jan 23 09:49:56 compute-0 systemd-machined[153562]: Machine qemu-88-instance-000000ba terminated.
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [NOTICE]   (234436) : haproxy version is 2.8.14-c23fe91
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [NOTICE]   (234436) : path to executable is /usr/sbin/haproxy
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [WARNING]  (234436) : Exiting Master process...
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [WARNING]  (234436) : Exiting Master process...
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [ALERT]    (234436) : Current worker (234438) exited with code 143 (Terminated)
Jan 23 09:49:56 compute-0 neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e[234432]: [WARNING]  (234436) : All workers exited. Exiting... (0)
Jan 23 09:49:56 compute-0 systemd[1]: libpod-789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8.scope: Deactivated successfully.
Jan 23 09:49:56 compute-0 podman[234787]: 2026-01-23 09:49:56.370512026 +0000 UTC m=+0.032946052 container died 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 09:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8-userdata-shm.mount: Deactivated successfully.
Jan 23 09:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-bb4f8907eff4449abe58895372ebb207ffdb5e8a5d7ac78ce037314098474e6f-merged.mount: Deactivated successfully.
Jan 23 09:49:56 compute-0 podman[234787]: 2026-01-23 09:49:56.389776564 +0000 UTC m=+0.052210591 container cleanup 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 09:49:56 compute-0 systemd[1]: libpod-conmon-789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8.scope: Deactivated successfully.
Jan 23 09:49:56 compute-0 podman[234810]: 2026-01-23 09:49:56.430306635 +0000 UTC m=+0.025160185 container remove 789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.435 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb25a5e-6279-4c7b-8e7c-9445a0970419]: (4, ('Fri Jan 23 09:49:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e (789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8)\n789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8\nFri Jan 23 09:49:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e (789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8)\n789e7f0aea185890d268b767374d31615a563a85c23eae178b496b45906a84d8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.438 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.439 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f7096b-f366-40a6-b7fd-71cf2860bbe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.440 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap851738b3-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.441 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 kernel: tap851738b3-c0: left promiscuous mode
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.456 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.459 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[b69c987a-64d7-482b-86f2-7993188f4d52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.466 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[27a57d5e-21da-4379-90ec-25549ae7da6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.467 182096 INFO nova.virt.libvirt.driver [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Instance destroyed successfully.
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.467 182096 DEBUG nova.objects.instance [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lazy-loading 'resources' on Instance uuid a3fa434c-5abe-4d55-a14d-f540ec42a580 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.467 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6c7d3e-2a62-4ac6-bad8-00bcd184cfc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.480 210209 DEBUG oslo.privsep.daemon [-] privsep: reply[f57cb444-cd3c-4cf6-8bef-22727bd73b33]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523480, 'reachable_time': 43436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234841, 'error': None, 'target': 'ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d851738b3\x2dcf3b\x2d4d16\x2d99d6\x2d1b0160959d1e.mount: Deactivated successfully.
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.483 104488 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-851738b3-cf3b-4d16-99d6-1b0160959d1e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 09:49:56 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:56.483 104488 DEBUG oslo.privsep.daemon [-] privsep: reply[92c1b7e9-0d36-4901-b27c-66db13ff244b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.484 182096 DEBUG nova.virt.libvirt.vif [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:48:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-704129977',display_name='tempest-TestGettingAddress-server-704129977',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-704129977',id=186,image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJgLJeFOzNTHEEI2Bw8lK4zjOz2xbO6XtGnUS/1veMSGVE5UBl5KMZE1BWI4iYOwtc9ZDRT1bam8Cj+8EfrZRDIHN76dap7/ylnixLNrBHMzKPZDOp36/7j4T+9G59cRTg==',key_name='tempest-TestGettingAddress-2014497851',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:48:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4181f6c647942e881af13381cc2f253',ramdisk_id='',reservation_id='r-07hff9g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84bf9744-ebe0-4357-9697-347a3a1a297e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-10741833',owner_user_name='tempest-TestGettingAddress-10741833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:48:50Z,user_data=None,user_id='2223cd913aab4f7cbffc6e9c703c6acc',uuid=a3fa434c-5abe-4d55-a14d-f540ec42a580,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.485 182096 DEBUG nova.network.os_vif_util [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converting VIF {"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.485 182096 DEBUG nova.network.os_vif_util [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.486 182096 DEBUG os_vif [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.487 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.487 182096 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ba2d913-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.490 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.504 182096 INFO os_vif [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:51:50,bridge_name='br-int',has_traffic_filtering=True,id=4ba2d913-a539-4f82-8b0a-74163c704c0d,network=Network(851738b3-cf3b-4d16-99d6-1b0160959d1e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ba2d913-a5')
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.505 182096 INFO nova.virt.libvirt.driver [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Deleting instance files /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580_del
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.511 182096 INFO nova.virt.libvirt.driver [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Deletion of /var/lib/nova/instances/a3fa434c-5abe-4d55-a14d-f540ec42a580_del complete
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.514 182096 DEBUG nova.compute.manager [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-unplugged-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.514 182096 DEBUG oslo_concurrency.lockutils [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.514 182096 DEBUG oslo_concurrency.lockutils [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.514 182096 DEBUG oslo_concurrency.lockutils [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.515 182096 DEBUG nova.compute.manager [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] No waiting events found dispatching network-vif-unplugged-4ba2d913-a539-4f82-8b0a-74163c704c0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.515 182096 DEBUG nova.compute.manager [req-2fcf11f3-e21f-4763-828a-13e28ba8b763 req-862ccf09-67f1-4799-9252-050a3fdf830d 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-unplugged-4ba2d913-a539-4f82-8b0a-74163c704c0d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.574 182096 INFO nova.compute.manager [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.574 182096 DEBUG oslo.service.loopingcall [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.574 182096 DEBUG nova.compute.manager [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 09:49:56 compute-0 nova_compute[182092]: 2026-01-23 09:49:56.574 182096 DEBUG nova.network.neutron [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 09:49:57 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:49:57.187 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.331 182096 DEBUG nova.network.neutron [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.341 182096 INFO nova.compute.manager [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Took 0.77 seconds to deallocate network for instance.
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.399 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.400 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.452 182096 DEBUG nova.compute.provider_tree [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.474 182096 DEBUG nova.scheduler.client.report [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.498 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.518 182096 INFO nova.scheduler.client.report [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Deleted allocations for instance a3fa434c-5abe-4d55-a14d-f540ec42a580
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.578 182096 DEBUG oslo_concurrency.lockutils [None req-1a205c28-b7cc-462e-b930-632bf9f96d73 2223cd913aab4f7cbffc6e9c703c6acc d4181f6c647942e881af13381cc2f253 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.606 182096 DEBUG nova.network.neutron [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updated VIF entry in instance network info cache for port 4ba2d913-a539-4f82-8b0a-74163c704c0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.607 182096 DEBUG nova.network.neutron [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Updating instance_info_cache with network_info: [{"id": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "address": "fa:16:3e:db:51:50", "network": {"id": "851738b3-cf3b-4d16-99d6-1b0160959d1e", "bridge": "br-int", "label": "tempest-network-smoke--2103742249", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fedb:5150", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d4181f6c647942e881af13381cc2f253", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ba2d913-a5", "ovs_interfaceid": "4ba2d913-a539-4f82-8b0a-74163c704c0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.619 182096 DEBUG oslo_concurrency.lockutils [req-2b578efe-949e-4dfe-a306-9ecd5a7725ce req-d92d96bc-3a4e-4b5f-8c23-01f1cfe00385 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Releasing lock "refresh_cache-a3fa434c-5abe-4d55-a14d-f540ec42a580" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 09:49:57 compute-0 nova_compute[182092]: 2026-01-23 09:49:57.698 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.289 182096 DEBUG nova.compute.manager [req-693f2cea-fc60-4214-be66-08ed8731b51a req-c7cd8fd3-5af5-46b8-8e0e-b46701e5e8c7 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-deleted-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.578 182096 DEBUG nova.compute.manager [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.579 182096 DEBUG oslo_concurrency.lockutils [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Acquiring lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.579 182096 DEBUG oslo_concurrency.lockutils [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.579 182096 DEBUG oslo_concurrency.lockutils [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] Lock "a3fa434c-5abe-4d55-a14d-f540ec42a580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.579 182096 DEBUG nova.compute.manager [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] No waiting events found dispatching network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 09:49:58 compute-0 nova_compute[182092]: 2026-01-23 09:49:58.579 182096 WARNING nova.compute.manager [req-844aa4e6-55c3-4fde-bf0e-729ed6d225bb req-41e5b5e9-c4da-41fb-b1c4-ec8e12c43b6f 80fbf79f0bc24bd8be7e47b9ef0e3515 73d1185fd9024fff9eed7e2e41c48f37 - - default default] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Received unexpected event network-vif-plugged-4ba2d913-a539-4f82-8b0a-74163c704c0d for instance with vm_state deleted and task_state None.
Jan 23 09:50:00 compute-0 nova_compute[182092]: 2026-01-23 09:50:00.430 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:00 compute-0 nova_compute[182092]: 2026-01-23 09:50:00.504 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:01 compute-0 nova_compute[182092]: 2026-01-23 09:50:01.488 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:02 compute-0 nova_compute[182092]: 2026-01-23 09:50:02.699 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:06 compute-0 nova_compute[182092]: 2026-01-23 09:50:06.490 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:07 compute-0 podman[234844]: 2026-01-23 09:50:07.219461245 +0000 UTC m=+0.056637908 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:50:07 compute-0 nova_compute[182092]: 2026-01-23 09:50:07.700 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:11 compute-0 nova_compute[182092]: 2026-01-23 09:50:11.466 182096 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161796.465499, a3fa434c-5abe-4d55-a14d-f540ec42a580 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 09:50:11 compute-0 nova_compute[182092]: 2026-01-23 09:50:11.467 182096 INFO nova.compute.manager [-] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] VM Stopped (Lifecycle Event)
Jan 23 09:50:11 compute-0 nova_compute[182092]: 2026-01-23 09:50:11.492 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:11 compute-0 nova_compute[182092]: 2026-01-23 09:50:11.524 182096 DEBUG nova.compute.manager [None req-eeac9b8d-f8da-4907-a33b-4ac8d0e536c3 - - - - - -] [instance: a3fa434c-5abe-4d55-a14d-f540ec42a580] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 09:50:12 compute-0 nova_compute[182092]: 2026-01-23 09:50:12.701 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:13 compute-0 podman[234868]: 2026-01-23 09:50:13.20409471 +0000 UTC m=+0.035419837 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:50:13 compute-0 podman[234867]: 2026-01-23 09:50:13.208958829 +0000 UTC m=+0.042442297 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 09:50:16 compute-0 nova_compute[182092]: 2026-01-23 09:50:16.493 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:17 compute-0 nova_compute[182092]: 2026-01-23 09:50:17.702 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:21 compute-0 nova_compute[182092]: 2026-01-23 09:50:21.495 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:22 compute-0 nova_compute[182092]: 2026-01-23 09:50:22.704 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:23 compute-0 nova_compute[182092]: 2026-01-23 09:50:23.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:24 compute-0 podman[234907]: 2026-01-23 09:50:24.201198609 +0000 UTC m=+0.036125848 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 09:50:24 compute-0 podman[234908]: 2026-01-23 09:50:24.211257131 +0000 UTC m=+0.043618144 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:50:26 compute-0 nova_compute[182092]: 2026-01-23 09:50:26.497 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:26 compute-0 nova_compute[182092]: 2026-01-23 09:50:26.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:26 compute-0 nova_compute[182092]: 2026-01-23 09:50:26.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:27 compute-0 podman[234945]: 2026-01-23 09:50:27.203185834 +0000 UTC m=+0.040957838 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.670 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.705 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.880 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.881 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.2120361328125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.881 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:50:27 compute-0 nova_compute[182092]: 2026-01-23 09:50:27.881 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.291 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.291 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.309 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.321 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.339 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:50:30 compute-0 nova_compute[182092]: 2026-01-23 09:50:30.339 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.338 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.339 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.339 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.362 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.362 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.362 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:31 compute-0 nova_compute[182092]: 2026-01-23 09:50:31.497 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:32 compute-0 nova_compute[182092]: 2026-01-23 09:50:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:32 compute-0 nova_compute[182092]: 2026-01-23 09:50:32.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:50:32 compute-0 nova_compute[182092]: 2026-01-23 09:50:32.706 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:50:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:50:36 compute-0 nova_compute[182092]: 2026-01-23 09:50:36.498 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:37 compute-0 nova_compute[182092]: 2026-01-23 09:50:37.707 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:38 compute-0 podman[234964]: 2026-01-23 09:50:38.227395434 +0000 UTC m=+0.061350474 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 09:50:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:50:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:50:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:50:39.880 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:50:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:50:39.881 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:50:40 compute-0 ovn_controller[94697]: 2026-01-23T09:50:40Z|00771|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 09:50:41 compute-0 nova_compute[182092]: 2026-01-23 09:50:41.499 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:42 compute-0 nova_compute[182092]: 2026-01-23 09:50:42.646 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:50:42 compute-0 nova_compute[182092]: 2026-01-23 09:50:42.708 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:44 compute-0 podman[234988]: 2026-01-23 09:50:44.195586632 +0000 UTC m=+0.030617703 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:50:44 compute-0 podman[234987]: 2026-01-23 09:50:44.203211366 +0000 UTC m=+0.040148581 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:50:46 compute-0 nova_compute[182092]: 2026-01-23 09:50:46.500 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:47 compute-0 nova_compute[182092]: 2026-01-23 09:50:47.709 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:51 compute-0 nova_compute[182092]: 2026-01-23 09:50:51.501 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:52 compute-0 nova_compute[182092]: 2026-01-23 09:50:52.711 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:55 compute-0 podman[235026]: 2026-01-23 09:50:55.203151123 +0000 UTC m=+0.036852237 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:50:55 compute-0 podman[235025]: 2026-01-23 09:50:55.226178216 +0000 UTC m=+0.061723607 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:50:56 compute-0 nova_compute[182092]: 2026-01-23 09:50:56.502 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:57 compute-0 nova_compute[182092]: 2026-01-23 09:50:57.712 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:50:58 compute-0 podman[235064]: 2026-01-23 09:50:58.228295548 +0000 UTC m=+0.067334207 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal)
Jan 23 09:51:01 compute-0 nova_compute[182092]: 2026-01-23 09:51:01.503 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:02 compute-0 nova_compute[182092]: 2026-01-23 09:51:02.714 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:06 compute-0 nova_compute[182092]: 2026-01-23 09:51:06.504 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:07 compute-0 nova_compute[182092]: 2026-01-23 09:51:07.715 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:09 compute-0 podman[235083]: 2026-01-23 09:51:09.218852195 +0000 UTC m=+0.056444575 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 23 09:51:11 compute-0 nova_compute[182092]: 2026-01-23 09:51:11.505 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:12 compute-0 nova_compute[182092]: 2026-01-23 09:51:12.716 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:13.366 103978 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:2e:94', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '56:4a:04:e5:13:72'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 09:51:13 compute-0 nova_compute[182092]: 2026-01-23 09:51:13.367 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:13 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:13.367 103978 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 09:51:15 compute-0 podman[235107]: 2026-01-23 09:51:15.207157785 +0000 UTC m=+0.043667807 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:51:15 compute-0 podman[235106]: 2026-01-23 09:51:15.236159852 +0000 UTC m=+0.075267351 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:16 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:16.369 103978 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5bdcc3dc-0ac8-4139-a9e7-75947c17f20e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 09:51:16 compute-0 nova_compute[182092]: 2026-01-23 09:51:16.506 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:17 compute-0 nova_compute[182092]: 2026-01-23 09:51:17.717 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:21 compute-0 nova_compute[182092]: 2026-01-23 09:51:21.507 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:22 compute-0 nova_compute[182092]: 2026-01-23 09:51:22.718 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:25 compute-0 nova_compute[182092]: 2026-01-23 09:51:25.664 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:26 compute-0 podman[235145]: 2026-01-23 09:51:26.205272712 +0000 UTC m=+0.038695341 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:51:26 compute-0 podman[235146]: 2026-01-23 09:51:26.211166093 +0000 UTC m=+0.042578634 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:51:26 compute-0 nova_compute[182092]: 2026-01-23 09:51:26.508 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:27 compute-0 nova_compute[182092]: 2026-01-23 09:51:27.720 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.661 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.661 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.662 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.662 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.662 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.679 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.679 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.679 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.679 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.863 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.864 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5761MB free_disk=73.2120361328125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.865 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.865 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.923 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.923 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.939 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.948 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.949 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:51:28 compute-0 nova_compute[182092]: 2026-01-23 09:51:28.949 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:51:29 compute-0 podman[235182]: 2026-01-23 09:51:29.221267727 +0000 UTC m=+0.059318292 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public)
Jan 23 09:51:29 compute-0 nova_compute[182092]: 2026-01-23 09:51:29.936 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:29 compute-0 nova_compute[182092]: 2026-01-23 09:51:29.937 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:31 compute-0 nova_compute[182092]: 2026-01-23 09:51:31.509 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:32 compute-0 nova_compute[182092]: 2026-01-23 09:51:32.722 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:33 compute-0 nova_compute[182092]: 2026-01-23 09:51:33.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:51:33 compute-0 nova_compute[182092]: 2026-01-23 09:51:33.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:51:36 compute-0 nova_compute[182092]: 2026-01-23 09:51:36.510 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:37 compute-0 nova_compute[182092]: 2026-01-23 09:51:37.724 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:39.881 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:51:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:39.881 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:51:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:51:39.881 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:51:39 compute-0 podman[235200]: 2026-01-23 09:51:39.952438648 +0000 UTC m=+0.052516058 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:51:41 compute-0 nova_compute[182092]: 2026-01-23 09:51:41.511 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:42 compute-0 nova_compute[182092]: 2026-01-23 09:51:42.724 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:46 compute-0 podman[235224]: 2026-01-23 09:51:46.202272824 +0000 UTC m=+0.037646525 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:51:46 compute-0 podman[235223]: 2026-01-23 09:51:46.205321512 +0000 UTC m=+0.043355799 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 09:51:46 compute-0 nova_compute[182092]: 2026-01-23 09:51:46.512 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:47 compute-0 nova_compute[182092]: 2026-01-23 09:51:47.725 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:51 compute-0 nova_compute[182092]: 2026-01-23 09:51:51.513 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:52 compute-0 nova_compute[182092]: 2026-01-23 09:51:52.726 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:56 compute-0 nova_compute[182092]: 2026-01-23 09:51:56.514 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:51:57 compute-0 podman[235261]: 2026-01-23 09:51:57.205155635 +0000 UTC m=+0.037872862 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:51:57 compute-0 podman[235262]: 2026-01-23 09:51:57.214337185 +0000 UTC m=+0.044857700 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:51:57 compute-0 nova_compute[182092]: 2026-01-23 09:51:57.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:00 compute-0 podman[235298]: 2026-01-23 09:52:00.200298564 +0000 UTC m=+0.039337373 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 23 09:52:01 compute-0 anacron[84233]: Job `cron.weekly' started
Jan 23 09:52:01 compute-0 anacron[84233]: Job `cron.weekly' terminated
Jan 23 09:52:01 compute-0 nova_compute[182092]: 2026-01-23 09:52:01.515 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:02 compute-0 nova_compute[182092]: 2026-01-23 09:52:02.729 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:06 compute-0 nova_compute[182092]: 2026-01-23 09:52:06.517 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:07 compute-0 nova_compute[182092]: 2026-01-23 09:52:07.730 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:10 compute-0 podman[235318]: 2026-01-23 09:52:10.217057184 +0000 UTC m=+0.054263753 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 23 09:52:11 compute-0 nova_compute[182092]: 2026-01-23 09:52:11.518 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:12 compute-0 nova_compute[182092]: 2026-01-23 09:52:12.732 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:16 compute-0 nova_compute[182092]: 2026-01-23 09:52:16.520 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:17 compute-0 podman[235343]: 2026-01-23 09:52:17.206158723 +0000 UTC m=+0.039013433 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 23 09:52:17 compute-0 podman[235342]: 2026-01-23 09:52:17.206691356 +0000 UTC m=+0.041762145 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 09:52:17 compute-0 nova_compute[182092]: 2026-01-23 09:52:17.735 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:21 compute-0 nova_compute[182092]: 2026-01-23 09:52:21.521 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:22 compute-0 nova_compute[182092]: 2026-01-23 09:52:22.736 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:26 compute-0 nova_compute[182092]: 2026-01-23 09:52:26.522 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:27 compute-0 nova_compute[182092]: 2026-01-23 09:52:27.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:27 compute-0 nova_compute[182092]: 2026-01-23 09:52:27.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:28 compute-0 podman[235380]: 2026-01-23 09:52:28.198220808 +0000 UTC m=+0.036589170 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:52:28 compute-0 podman[235381]: 2026-01-23 09:52:28.198328602 +0000 UTC m=+0.035115563 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.667 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.667 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.858 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.859 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5760MB free_disk=73.21203231811523GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.859 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.859 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.907 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.907 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.923 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.932 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.933 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:52:28 compute-0 nova_compute[182092]: 2026-01-23 09:52:28.933 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:52:29 compute-0 nova_compute[182092]: 2026-01-23 09:52:29.932 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:29 compute-0 nova_compute[182092]: 2026-01-23 09:52:29.932 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:52:29 compute-0 nova_compute[182092]: 2026-01-23 09:52:29.933 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:52:30 compute-0 nova_compute[182092]: 2026-01-23 09:52:30.071 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:52:30 compute-0 nova_compute[182092]: 2026-01-23 09:52:30.071 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:30 compute-0 nova_compute[182092]: 2026-01-23 09:52:30.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:30 compute-0 nova_compute[182092]: 2026-01-23 09:52:30.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:31 compute-0 podman[235416]: 2026-01-23 09:52:31.196855955 +0000 UTC m=+0.035955577 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Jan 23 09:52:31 compute-0 nova_compute[182092]: 2026-01-23 09:52:31.523 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:31 compute-0 nova_compute[182092]: 2026-01-23 09:52:31.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:32 compute-0 nova_compute[182092]: 2026-01-23 09:52:32.739 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:52:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:52:34 compute-0 nova_compute[182092]: 2026-01-23 09:52:34.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:34 compute-0 nova_compute[182092]: 2026-01-23 09:52:34.649 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:52:36 compute-0 nova_compute[182092]: 2026-01-23 09:52:36.524 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:37 compute-0 nova_compute[182092]: 2026-01-23 09:52:37.740 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:52:39.883 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:52:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:52:39.883 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:52:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:52:39.883 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:52:41 compute-0 podman[235435]: 2026-01-23 09:52:41.216314975 +0000 UTC m=+0.053982213 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:52:41 compute-0 nova_compute[182092]: 2026-01-23 09:52:41.525 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:42 compute-0 nova_compute[182092]: 2026-01-23 09:52:42.742 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:44 compute-0 nova_compute[182092]: 2026-01-23 09:52:44.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:52:46 compute-0 nova_compute[182092]: 2026-01-23 09:52:46.525 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:47 compute-0 nova_compute[182092]: 2026-01-23 09:52:47.744 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:48 compute-0 podman[235458]: 2026-01-23 09:52:48.205158647 +0000 UTC m=+0.040396621 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 23 09:52:48 compute-0 podman[235459]: 2026-01-23 09:52:48.226138219 +0000 UTC m=+0.059518160 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:52:51 compute-0 nova_compute[182092]: 2026-01-23 09:52:51.526 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:52 compute-0 nova_compute[182092]: 2026-01-23 09:52:52.745 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:56 compute-0 nova_compute[182092]: 2026-01-23 09:52:56.527 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:57 compute-0 nova_compute[182092]: 2026-01-23 09:52:57.747 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:52:59 compute-0 podman[235497]: 2026-01-23 09:52:59.196172436 +0000 UTC m=+0.034505062 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 09:52:59 compute-0 podman[235498]: 2026-01-23 09:52:59.219850225 +0000 UTC m=+0.056386895 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:53:01 compute-0 nova_compute[182092]: 2026-01-23 09:53:01.528 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:02 compute-0 podman[235535]: 2026-01-23 09:53:02.202077388 +0000 UTC m=+0.032490285 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:53:02 compute-0 nova_compute[182092]: 2026-01-23 09:53:02.749 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:06 compute-0 nova_compute[182092]: 2026-01-23 09:53:06.528 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:07 compute-0 nova_compute[182092]: 2026-01-23 09:53:07.751 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:11 compute-0 nova_compute[182092]: 2026-01-23 09:53:11.529 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:11 compute-0 podman[235554]: 2026-01-23 09:53:11.603344669 +0000 UTC m=+0.053719848 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 23 09:53:12 compute-0 nova_compute[182092]: 2026-01-23 09:53:12.753 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:16 compute-0 nova_compute[182092]: 2026-01-23 09:53:16.530 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:17 compute-0 nova_compute[182092]: 2026-01-23 09:53:17.755 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:19 compute-0 podman[235578]: 2026-01-23 09:53:19.205195926 +0000 UTC m=+0.040765054 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:53:19 compute-0 podman[235577]: 2026-01-23 09:53:19.208296021 +0000 UTC m=+0.046385503 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 09:53:21 compute-0 nova_compute[182092]: 2026-01-23 09:53:21.531 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:22 compute-0 nova_compute[182092]: 2026-01-23 09:53:22.757 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:26 compute-0 nova_compute[182092]: 2026-01-23 09:53:26.532 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:27 compute-0 nova_compute[182092]: 2026-01-23 09:53:27.654 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:27 compute-0 nova_compute[182092]: 2026-01-23 09:53:27.758 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.670 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.862 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.863 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5754MB free_disk=73.21254348754883GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.863 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.863 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.995 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:53:28 compute-0 nova_compute[182092]: 2026-01-23 09:53:28.995 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.077 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing inventories for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.129 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating ProviderTree inventory for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.129 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Updating inventory in ProviderTree for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.143 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing aggregate associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.160 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Refreshing trait associations for resource provider 052a7ae7-9ec7-49ca-a013-73791f9c049a, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.174 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.189 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.190 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:53:29 compute-0 nova_compute[182092]: 2026-01-23 09:53:29.190 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:53:30 compute-0 podman[235616]: 2026-01-23 09:53:30.203185758 +0000 UTC m=+0.041200426 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 09:53:30 compute-0 podman[235617]: 2026-01-23 09:53:30.225133976 +0000 UTC m=+0.060931935 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:53:31 compute-0 nova_compute[182092]: 2026-01-23 09:53:31.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.190 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.190 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.190 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.206 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.207 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.207 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.207 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:32 compute-0 nova_compute[182092]: 2026-01-23 09:53:32.759 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:33 compute-0 podman[235652]: 2026-01-23 09:53:33.193964743 +0000 UTC m=+0.032985478 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:53:35 compute-0 nova_compute[182092]: 2026-01-23 09:53:35.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:35 compute-0 nova_compute[182092]: 2026-01-23 09:53:35.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:53:36 compute-0 nova_compute[182092]: 2026-01-23 09:53:36.534 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:37 compute-0 nova_compute[182092]: 2026-01-23 09:53:37.760 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:53:39.884 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:53:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:53:39.884 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:53:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:53:39.884 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:53:41 compute-0 nova_compute[182092]: 2026-01-23 09:53:41.535 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:42 compute-0 podman[235670]: 2026-01-23 09:53:42.213343104 +0000 UTC m=+0.052830822 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:53:42 compute-0 nova_compute[182092]: 2026-01-23 09:53:42.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:42 compute-0 nova_compute[182092]: 2026-01-23 09:53:42.761 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:46 compute-0 nova_compute[182092]: 2026-01-23 09:53:46.535 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:46 compute-0 nova_compute[182092]: 2026-01-23 09:53:46.661 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:46 compute-0 nova_compute[182092]: 2026-01-23 09:53:46.661 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 09:53:46 compute-0 nova_compute[182092]: 2026-01-23 09:53:46.673 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 09:53:47 compute-0 nova_compute[182092]: 2026-01-23 09:53:47.762 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:49 compute-0 nova_compute[182092]: 2026-01-23 09:53:49.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:53:49 compute-0 nova_compute[182092]: 2026-01-23 09:53:49.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 09:53:50 compute-0 podman[235694]: 2026-01-23 09:53:50.202716071 +0000 UTC m=+0.039277930 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:53:50 compute-0 podman[235693]: 2026-01-23 09:53:50.204252678 +0000 UTC m=+0.042579434 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 09:53:51 compute-0 nova_compute[182092]: 2026-01-23 09:53:51.537 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:52 compute-0 nova_compute[182092]: 2026-01-23 09:53:52.765 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:56 compute-0 nova_compute[182092]: 2026-01-23 09:53:56.540 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:53:57 compute-0 nova_compute[182092]: 2026-01-23 09:53:57.767 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:01 compute-0 podman[235732]: 2026-01-23 09:54:01.197350085 +0000 UTC m=+0.034668650 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:54:01 compute-0 podman[235733]: 2026-01-23 09:54:01.22722105 +0000 UTC m=+0.062890516 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 23 09:54:01 compute-0 nova_compute[182092]: 2026-01-23 09:54:01.543 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:02 compute-0 nova_compute[182092]: 2026-01-23 09:54:02.769 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:04 compute-0 podman[235770]: 2026-01-23 09:54:04.196051567 +0000 UTC m=+0.035450535 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 23 09:54:06 compute-0 nova_compute[182092]: 2026-01-23 09:54:06.544 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:07 compute-0 nova_compute[182092]: 2026-01-23 09:54:07.770 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:11 compute-0 nova_compute[182092]: 2026-01-23 09:54:11.547 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:12 compute-0 nova_compute[182092]: 2026-01-23 09:54:12.771 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:13 compute-0 podman[235788]: 2026-01-23 09:54:13.21042855 +0000 UTC m=+0.048934536 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 09:54:16 compute-0 nova_compute[182092]: 2026-01-23 09:54:16.549 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:17 compute-0 nova_compute[182092]: 2026-01-23 09:54:17.771 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:21 compute-0 podman[235812]: 2026-01-23 09:54:21.19817473 +0000 UTC m=+0.034837839 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:54:21 compute-0 podman[235811]: 2026-01-23 09:54:21.202164532 +0000 UTC m=+0.040724418 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:54:21 compute-0 nova_compute[182092]: 2026-01-23 09:54:21.551 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:22 compute-0 nova_compute[182092]: 2026-01-23 09:54:22.772 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:26 compute-0 nova_compute[182092]: 2026-01-23 09:54:26.553 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:27 compute-0 nova_compute[182092]: 2026-01-23 09:54:27.773 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.658 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.659 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.687 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.687 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.687 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.687 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.859 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.860 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5755MB free_disk=73.21254348754883GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.860 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.860 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.924 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.924 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.942 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.960 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.961 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:54:28 compute-0 nova_compute[182092]: 2026-01-23 09:54:28.961 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:54:30 compute-0 nova_compute[182092]: 2026-01-23 09:54:30.952 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:31 compute-0 nova_compute[182092]: 2026-01-23 09:54:31.556 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:31 compute-0 nova_compute[182092]: 2026-01-23 09:54:31.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:32 compute-0 podman[235850]: 2026-01-23 09:54:32.203221664 +0000 UTC m=+0.040411038 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 09:54:32 compute-0 podman[235851]: 2026-01-23 09:54:32.203443101 +0000 UTC m=+0.038968697 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.659 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.659 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:32 compute-0 nova_compute[182092]: 2026-01-23 09:54:32.774 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:54:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:54:33 compute-0 nova_compute[182092]: 2026-01-23 09:54:33.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:33 compute-0 nova_compute[182092]: 2026-01-23 09:54:33.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:35 compute-0 podman[235888]: 2026-01-23 09:54:35.198187613 +0000 UTC m=+0.036334141 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 
'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6)
Jan 23 09:54:35 compute-0 nova_compute[182092]: 2026-01-23 09:54:35.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:35 compute-0 nova_compute[182092]: 2026-01-23 09:54:35.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:54:36 compute-0 nova_compute[182092]: 2026-01-23 09:54:36.558 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:37 compute-0 nova_compute[182092]: 2026-01-23 09:54:37.776 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:54:39.885 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:54:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:54:39.885 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:54:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:54:39.885 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:54:41 compute-0 nova_compute[182092]: 2026-01-23 09:54:41.560 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:42 compute-0 nova_compute[182092]: 2026-01-23 09:54:42.777 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:44 compute-0 podman[235906]: 2026-01-23 09:54:44.212244314 +0000 UTC m=+0.051185409 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 09:54:44 compute-0 nova_compute[182092]: 2026-01-23 09:54:44.645 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:54:46 compute-0 nova_compute[182092]: 2026-01-23 09:54:46.562 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:47 compute-0 nova_compute[182092]: 2026-01-23 09:54:47.778 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:51 compute-0 nova_compute[182092]: 2026-01-23 09:54:51.565 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:52 compute-0 podman[235930]: 2026-01-23 09:54:52.205166562 +0000 UTC m=+0.041765170 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:54:52 compute-0 podman[235929]: 2026-01-23 09:54:52.207370007 +0000 UTC m=+0.045739833 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 23 09:54:52 compute-0 nova_compute[182092]: 2026-01-23 09:54:52.780 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:56 compute-0 nova_compute[182092]: 2026-01-23 09:54:56.567 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:54:57 compute-0 nova_compute[182092]: 2026-01-23 09:54:57.782 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:01 compute-0 nova_compute[182092]: 2026-01-23 09:55:01.569 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:02 compute-0 nova_compute[182092]: 2026-01-23 09:55:02.784 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:03 compute-0 podman[235968]: 2026-01-23 09:55:03.192245924 +0000 UTC m=+0.031655930 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 09:55:03 compute-0 podman[235969]: 2026-01-23 09:55:03.199144589 +0000 UTC m=+0.036689671 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:55:06 compute-0 podman[236006]: 2026-01-23 09:55:06.201173985 +0000 UTC m=+0.039825815 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Jan 23 09:55:06 compute-0 nova_compute[182092]: 2026-01-23 09:55:06.571 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:07 compute-0 nova_compute[182092]: 2026-01-23 09:55:07.785 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:11 compute-0 nova_compute[182092]: 2026-01-23 09:55:11.573 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:12 compute-0 nova_compute[182092]: 2026-01-23 09:55:12.787 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:15 compute-0 podman[236024]: 2026-01-23 09:55:15.215234835 +0000 UTC m=+0.054075811 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 09:55:16 compute-0 nova_compute[182092]: 2026-01-23 09:55:16.575 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:17 compute-0 nova_compute[182092]: 2026-01-23 09:55:17.787 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:21 compute-0 nova_compute[182092]: 2026-01-23 09:55:21.577 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:22 compute-0 nova_compute[182092]: 2026-01-23 09:55:22.788 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:23 compute-0 podman[236047]: 2026-01-23 09:55:23.206211689 +0000 UTC m=+0.041234122 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 23 09:55:23 compute-0 podman[236048]: 2026-01-23 09:55:23.207479489 +0000 UTC m=+0.039397631 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 23 09:55:26 compute-0 nova_compute[182092]: 2026-01-23 09:55:26.578 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:27 compute-0 nova_compute[182092]: 2026-01-23 09:55:27.790 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.670 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.670 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.850 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.851 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5750MB free_disk=73.21254348754883GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": 
"0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.851 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.851 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.902 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.902 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.920 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.929 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.930 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:55:29 compute-0 nova_compute[182092]: 2026-01-23 09:55:29.930 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:55:30 compute-0 nova_compute[182092]: 2026-01-23 09:55:30.926 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:31 compute-0 nova_compute[182092]: 2026-01-23 09:55:31.580 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:31 compute-0 nova_compute[182092]: 2026-01-23 09:55:31.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:32 compute-0 nova_compute[182092]: 2026-01-23 09:55:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:32 compute-0 nova_compute[182092]: 2026-01-23 09:55:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:32 compute-0 nova_compute[182092]: 2026-01-23 09:55:32.790 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.663 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:33 compute-0 nova_compute[182092]: 2026-01-23 09:55:33.663 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:34 compute-0 podman[236087]: 2026-01-23 09:55:34.205289775 +0000 UTC m=+0.039576258 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 09:55:34 compute-0 podman[236088]: 2026-01-23 09:55:34.207875219 +0000 UTC m=+0.039888346 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:55:36 compute-0 nova_compute[182092]: 2026-01-23 09:55:36.581 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:36 compute-0 nova_compute[182092]: 2026-01-23 09:55:36.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:55:36 compute-0 nova_compute[182092]: 2026-01-23 09:55:36.650 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 09:55:37 compute-0 podman[236124]: 2026-01-23 09:55:37.226297254 +0000 UTC m=+0.065306213 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 23 09:55:37 compute-0 nova_compute[182092]: 2026-01-23 09:55:37.791 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:55:39.886 103978 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:55:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:55:39.887 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:55:39 compute-0 ovn_metadata_agent[103973]: 2026-01-23 09:55:39.887 103978 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:55:41 compute-0 nova_compute[182092]: 2026-01-23 09:55:41.583 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:42 compute-0 nova_compute[182092]: 2026-01-23 09:55:42.793 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:46 compute-0 podman[236142]: 2026-01-23 09:55:46.241203567 +0000 UTC m=+0.080218095 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 09:55:46 compute-0 nova_compute[182092]: 2026-01-23 09:55:46.585 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:47 compute-0 nova_compute[182092]: 2026-01-23 09:55:47.793 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:51 compute-0 nova_compute[182092]: 2026-01-23 09:55:51.588 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:52 compute-0 nova_compute[182092]: 2026-01-23 09:55:52.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:54 compute-0 podman[236166]: 2026-01-23 09:55:54.207197028 +0000 UTC m=+0.042429657 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 23 09:55:54 compute-0 podman[236165]: 2026-01-23 09:55:54.211164197 +0000 UTC m=+0.049394057 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 09:55:56 compute-0 nova_compute[182092]: 2026-01-23 09:55:56.590 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:55:57 compute-0 nova_compute[182092]: 2026-01-23 09:55:57.796 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:01 compute-0 nova_compute[182092]: 2026-01-23 09:56:01.592 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:02 compute-0 nova_compute[182092]: 2026-01-23 09:56:02.798 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:05 compute-0 podman[236203]: 2026-01-23 09:56:05.205277344 +0000 UTC m=+0.038874264 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 23 09:56:05 compute-0 podman[236202]: 2026-01-23 09:56:05.228242978 +0000 UTC m=+0.064029155 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:56:06 compute-0 nova_compute[182092]: 2026-01-23 09:56:06.593 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:07 compute-0 nova_compute[182092]: 2026-01-23 09:56:07.799 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:08 compute-0 podman[236239]: 2026-01-23 09:56:08.198320365 +0000 UTC m=+0.036161941 container health_status 7bf4b5995bd63f33948ca96d18504330f2d5a2367869ff30d135dc578a1683df (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Jan 23 09:56:11 compute-0 nova_compute[182092]: 2026-01-23 09:56:11.595 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:12 compute-0 nova_compute[182092]: 2026-01-23 09:56:12.801 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:16 compute-0 nova_compute[182092]: 2026-01-23 09:56:16.597 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:17 compute-0 podman[236257]: 2026-01-23 09:56:17.213221395 +0000 UTC m=+0.050291498 container health_status f03aba2a4e69f30e753a7e21354c57fa8bd5d0775ee8f96f7c5999a5d1c5085d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:56:17 compute-0 nova_compute[182092]: 2026-01-23 09:56:17.803 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:18 compute-0 sshd-session[236280]: Accepted publickey for zuul from 192.168.122.10 port 36172 ssh2: ECDSA SHA256:UzQQmtS/YS2ihhvgl74NKeafgjxNhU8HinfCv0WFa70
Jan 23 09:56:18 compute-0 systemd-logind[746]: New session 64 of user zuul.
Jan 23 09:56:18 compute-0 systemd[1]: Started Session 64 of User zuul.
Jan 23 09:56:18 compute-0 sshd-session[236280]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:18 compute-0 sudo[236284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 23 09:56:18 compute-0 sudo[236284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:21 compute-0 nova_compute[182092]: 2026-01-23 09:56:21.598 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:22 compute-0 nova_compute[182092]: 2026-01-23 09:56:22.805 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:25 compute-0 podman[236438]: 2026-01-23 09:56:25.205355582 +0000 UTC m=+0.038496121 container health_status e187c1e04fadb06f0d4d1c96368105d10033b2f08f9678fa48cf8570f51c20a3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 23 09:56:25 compute-0 podman[236437]: 2026-01-23 09:56:25.208136685 +0000 UTC m=+0.041439912 container health_status 29f52509e9442576d169cceed55538f813c645506fe24c4307d843ad6eccadbc (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 09:56:26 compute-0 nova_compute[182092]: 2026-01-23 09:56:26.600 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:27 compute-0 nova_compute[182092]: 2026-01-23 09:56:27.806 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:28 compute-0 ovs-vsctl[236526]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 09:56:29 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 236308 (sos)
Jan 23 09:56:29 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 09:56:29 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 09:56:29 compute-0 virtqemud[181713]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 09:56:29 compute-0 virtqemud[181713]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 09:56:29 compute-0 virtqemud[181713]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.669 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.669 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.881 182096 WARNING nova.virt.libvirt.driver [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.881 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.19392013549805GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.882 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.882 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.933 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.933 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.952 182096 DEBUG nova.compute.provider_tree [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed in ProviderTree for provider: 052a7ae7-9ec7-49ca-a013-73791f9c049a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.963 182096 DEBUG nova.scheduler.client.report [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Inventory has not changed for provider 052a7ae7-9ec7-49ca-a013-73791f9c049a based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.964 182096 DEBUG nova.compute.resource_tracker [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 09:56:29 compute-0 nova_compute[182092]: 2026-01-23 09:56:29.964 182096 DEBUG oslo_concurrency.lockutils [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 09:56:30 compute-0 crontab[236909]: (root) LIST (root)
Jan 23 09:56:30 compute-0 nova_compute[182092]: 2026-01-23 09:56:30.960 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:31 compute-0 nova_compute[182092]: 2026-01-23 09:56:31.602 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:32 compute-0 systemd[1]: Starting Hostname Service...
Jan 23 09:56:32 compute-0 systemd[1]: Started Hostname Service.
Jan 23 09:56:32 compute-0 nova_compute[182092]: 2026-01-23 09:56:32.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:32 compute-0 nova_compute[182092]: 2026-01-23 09:56:32.810 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 ceilometer_agent_compute[191702]: 2026-01-23 09:56:33.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 23 09:56:33 compute-0 nova_compute[182092]: 2026-01-23 09:56:33.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:33 compute-0 nova_compute[182092]: 2026-01-23 09:56:33.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:34 compute-0 nova_compute[182092]: 2026-01-23 09:56:34.649 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:34 compute-0 nova_compute[182092]: 2026-01-23 09:56:34.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:35 compute-0 nova_compute[182092]: 2026-01-23 09:56:35.650 182096 DEBUG oslo_service.periodic_task [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 09:56:35 compute-0 nova_compute[182092]: 2026-01-23 09:56:35.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 09:56:35 compute-0 nova_compute[182092]: 2026-01-23 09:56:35.651 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 09:56:35 compute-0 nova_compute[182092]: 2026-01-23 09:56:35.667 182096 DEBUG nova.compute.manager [None req-645291ee-bb8c-458f-9525-d50750f8eee2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 09:56:36 compute-0 podman[237425]: 2026-01-23 09:56:36.248228088 +0000 UTC m=+0.083760383 container health_status a31c901a6c146fb612e50d188e6b0589e6269e5bde2dc7b2710f89c01409940d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6b92037f0cc6a48124cc14a7143f91eba80b900c59b235fa991b5c26f47a86e0-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:56:36 compute-0 podman[237427]: 2026-01-23 09:56:36.293504761 +0000 UTC m=+0.110264536 container health_status d75e8a4fe37a77be0c1cfaf8faa01e9d5f44be9e18f82b6e01b9cb820e8f84fd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '002b5d52bd4ea8b0cf09e2c7e8362912d3980d2f361bd12c782f0b4b73512558-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 23 09:56:36 compute-0 nova_compute[182092]: 2026-01-23 09:56:36.603 182096 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
